PGLearn.E2ELRReserveScaler — Type
`E2ELRReserveScaler`

Samples reserve requirements following the procedure below:
- Sample a minimum reserve requirement `MRR` from a uniform distribution `U(lb, ub)` (`mrr_dist`).
- Compute the upper bound of reserve requirements for each generator as `rmax = α * (pmax - pmin)`.
- Fix the lower bound of reserve requirement per generator to zero.
- Fix the reserve cost of each generator to zero.
The parameter α is a scaling factor that determines each generator's maximum reserve capacity. It is the `factor` parameter times the ratio of the largest generator's capacity to the sum of all generators' dispatchable capacity.
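A standalone sketch of the procedure above. The helper name is hypothetical, and the exact capacity definitions in α are an assumption: the largest generator's capacity is taken as `maximum(pmax)` and the dispatchable capacity as `sum(pmax - pmin)`.

```julia
using Random

# Hypothetical sketch of the E2ELR reserve sampling procedure (not the
# package implementation). `pmin`, `pmax` are generator limits, `lb`, `ub`
# bound the MRR distribution, `factor` is the user-supplied parameter.
function sample_reserves(rng, pmin, pmax, lb, ub, factor)
    MRR = lb + (ub - lb) * rand(rng)                # MRR ~ U(lb, ub)
    α = factor * maximum(pmax) / sum(pmax .- pmin)  # assumed definition of α
    rmax = α .* (pmax .- pmin)                      # per-generator upper bounds
    rmin = zero(pmax)                               # lower bounds fixed to zero
    return MRR, rmin, rmax
end
```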
PGLearn.Glocal — Type
`Glocal{G,L}`

A glocal distribution with global/local factors `α::G` and `η::L`.

This distribution represents a random variable of the form `ϵ = α×η`, where
- `α` is a scalar random variable, with distribution `d_α::G`
- `η` is a vector random variable, with distribution `d_η::L`
- `α` and `η` are independent random variables
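A minimal sketch of sampling from such a glocal distribution, using plain uniform draws in place of the `d_α`/`d_η` distribution objects (the helper name and parameterization are hypothetical):

```julia
using Random

# Sketch: a glocal random variable ϵ = α × η, with a scalar global factor
# α ~ U[l, u] and n local factors ηᵢ ~ U[1-σ, 1+σ], drawn independently.
function sample_glocal(rng, l, u, σ, n)
    α = l + (u - l) * rand(rng)        # scalar global factor
    η = 1 .- σ .+ 2σ .* rand(rng, n)   # vector of n local factors
    return α .* η                      # elementwise product
end
```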
PGLearn.LoadScaler — Type
`LoadScaler{D}`

Scales loads with multiplicative noise sampled from `d::D`.
The distribution `d::D` is a `2L`-dimensional distribution. The sampled active (resp. reactive) demand for load $i$ is denoted by $\tilde{p}_{i}$ (resp. $\tilde{q}_{i}$) and has the form
\[\tilde{p}_{i} = \epsilon_{i} \bar{p}_{i}, \quad \tilde{q}_{i} = \epsilon_{i+L} \bar{q}_{i},\]
where $\bar{p}_{i}, \bar{q}_{i}$ are the reference active/reactive demand for load $i$, and $\epsilon \in \mathbb{R}^{2L}$ is multiplicative noise sampled from distribution `d::D`.
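Applying a sampled noise vector to reference loads per the formula above can be sketched as follows (helper name hypothetical):

```julia
# Sketch: apply 2L-dimensional multiplicative noise ϵ to reference loads.
# ϵ[1:L] scales active power, ϵ[L+1:2L] scales reactive power.
function scale_loads(p_ref, q_ref, ϵ)
    L = length(p_ref)
    p_new = ϵ[1:L] .* p_ref       # p̃ᵢ = ϵᵢ p̄ᵢ
    q_new = ϵ[L+1:2L] .* q_ref    # q̃ᵢ = ϵᵢ₊L q̄ᵢ
    return p_new, q_new
end
```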
PGLearn.Nminus1StatusSampler — Type
`Nminus1StatusSampler`

Samples a branch or generator to be inactive following the procedure below:
- With probability 1/2, decide whether a branch or a generator is inactive in this instance.
If a branch is inactive:
- Identify the set of branches that are not bridges (removing a bridge would disconnect the network).
- Sample a branch to be inactive from the set of non-bridge branches.
- Construct the status vector to be all ones except for zero at the sampled branch.
If a generator is inactive:
- Sample a generator to be inactive.
- Construct the status vector to be all ones except for zero at the sampled generator.
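The procedure above can be sketched as follows (names hypothetical; identifying the non-bridge branches is assumed done beforehand, e.g. with `bridges`):

```julia
using Random

# Sketch of N-1 status sampling: with probability 1/2 outage a non-bridge
# branch, otherwise outage a generator; status vectors are all ones except
# for a single zero at the sampled component.
function sample_n_minus_1(rng, nonbridge, nbranch, ngen)
    br_status  = ones(Int, nbranch)
    gen_status = ones(Int, ngen)
    if rand(rng) < 0.5
        e = rand(rng, nonbridge)   # sample a non-bridge branch
        br_status[e] = 0
    else
        g = rand(rng, 1:ngen)      # sample a generator
        gen_status[g] = 0
    end
    return br_status, gen_status
end
```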
PGLearn.OPFData — Method
`OPFData(network::Dict{String,Any}; compute_clique_decomposition::Bool=false)`

Convert a PowerModels data dictionary to an `OPFData` structure.
The PowerModels data dictionary must be in basic format.
PGLearn.ScaledLogNormal — Method
`ScaledLogNormal(l, u, σs)`

Generate a `Glocal` distribution `ϵ = α×η` where `α ~ U[l,u]` and `ηᵢ ~ LogNormal(-σᵢ²/2, σᵢ)`.
PGLearn.ScaledUniform — Method
`ScaledUniform(l, u, σs)`

Generate a `Glocal` distribution `ϵ = α×η` where `α ~ U[l,u]` and `ηᵢ ~ U[1-σᵢ, 1+σᵢ]`.
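The LogNormal parameterization in `ScaledLogNormal` makes the local factors mean-one, since E[η] = exp(-σ²/2 + σ²/2) = 1, so only the global factor α shifts the average level. A small sketch (helper name hypothetical) checks this numerically:

```julia
using Random, Statistics

# ηᵢ ~ LogNormal(-σ²/2, σ), sampled as exp of a shifted Normal draw.
# With this shift, E[η] = exp(-σ²/2 + σ²/2) = 1.
lognormal_sample(rng, σ) = exp(-σ^2 / 2 + σ * randn(rng))
```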
PGLearn._dedupe_and_sort_h5! — Method
`_dedupe_and_sort_h5!(D)`

De-duplicate and sort dataset `D` in increasing order of random seeds.

Equivalent to `_dedupe_h5!(D); _sort_h5!(D)`, but more efficient.
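The combined operation can be sketched on a flat seed/value pair (helper name hypothetical; the real function reorders every array in the h5 dictionary):

```julia
# Sketch: sort by seed and drop duplicate seeds in one pass over the
# sorted permutation, modifying both vectors in place.
function dedupe_and_sort!(seeds::Vector{Int}, values::Vector)
    p = sortperm(seeds)   # stable sort: first occurrence of each seed wins
    keep = [i for (k, i) in enumerate(p) if k == 1 || seeds[i] != seeds[p[k-1]]]
    seeds2, values2 = seeds[keep], values[keep]
    empty!(seeds); append!(seeds, seeds2)
    empty!(values); append!(values, values2)
    return seeds, values
end
```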
PGLearn._dedupe_h5! — Method
`_dedupe_h5!(D)`

De-duplicate points in h5 dataset `D`, according to their random seed.
PGLearn._get_case_info — Method
`_get_case_info(config)`

Extract case file and name from input config.

To be valid, the input config should include:
- either a `case_file` or a `pglib_case` entry
- if no `case_file` is provided, `pglib_case` should be a valid, unique PGLib case name.
The case name will be set to the generic "case" value if none is provided.
PGLearn._get_clique_decomposition — Method
`_get_clique_decomposition(network::Dict{String,Any})`

Compute the clique decomposition of a PowerModels data dictionary. The output is a `Vector{Vector{Int}}` containing all the cliques in the chordal completion of `network`.
PGLearn._get_overlapping_pairs — Method
`_get_overlapping_pairs(groups)`

Get the indices of pairs of cliques in `groups` that overlap.
Refer to https://github.com/lanl-ansi/PowerModels.jl/blob/be6af59202a6868b20a41214cb341b883d62e5f0/src/form/wrm.jl#L273-L274
PGLearn._merge_h5 — Method
`_merge_h5(V::Vector{Array{T,N}})`

Concatenate a collection of N-dimensional arrays along their last dimension.

This function is semantically equivalent to `cat(V...; dims=ndims(first(V)))`, but uses a more efficient, splatting-free implementation. All elements of `V` must have the same size in the first N-1 dimensions.
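A splatting-free concatenation along the last dimension can be sketched as follows (helper name hypothetical; not the package implementation):

```julia
# Sketch: preallocate the output, then copy each array into its slice
# along the last dimension, avoiding the splatted cat(V...) call.
function merge_along_last(V::Vector{Array{T,N}}) where {T,N}
    sz = collect(size(first(V)))
    sz[N] = sum(size(A, N) for A in V)      # total extent of last dimension
    out = Array{T,N}(undef, sz...)
    i = 0
    for A in V
        n = size(A, N)
        out[ntuple(_ -> Colon(), N - 1)..., i+1:i+n] = A
        i += n
    end
    return out
end
```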
PGLearn._overlap_indices — Function
`idx_a, idx_b = _overlap_indices(A, B)`

Given two arrays (sizes need not match) that share some values, return:
- the linear indices of shared values in `A`
- the linear indices of shared values in `B`

Thus, `A[idx_a] == B[idx_b]`.
Refer to https://github.com/lanl-ansi/PowerModels.jl/blob/be6af59202a6868b20a41214cb341b883d62e5f0/src/form/wrm.jl#L469
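The contract can be illustrated with a small dictionary-based sketch (helper name hypothetical; not the package implementation, which follows the PowerModels code linked above):

```julia
# Sketch: recover linear indices of shared values, so A[idx_a] == B[idx_b].
# Assumes the values of B are unique, for simplicity.
function overlap_indices(A, B)
    posB = Dict(v => i for (i, v) in enumerate(vec(B)))
    idx_a = [i for (i, v) in enumerate(vec(A)) if haskey(posB, v)]
    idx_b = [posB[A[i]] for i in idx_a]
    return idx_a, idx_b
end
```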
PGLearn._select_h5! — Method
`_select_h5!(D, p)`

Select data points in `D` as indicated by `p`.

`D` should be a dictionary in h5-compatible format, and `p` is either a vector of indices or a logical vector of the same length as `D["meta"]["seed"]`.
- If `p` is a vector of indices, then all values of `p` should be integers between `1` and the number of elements in `D`.
- If `p` is a logical vector, then it should have the same length as `D["meta"]["seed"]`. Only datapoints `i` for which `p[i]` is `true` are selected.
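The two selection modes can be sketched on a single flat vector (helper name hypothetical; the real function applies the same selection to every array in the nested dictionary):

```julia
# Sketch: logical-mask selection vs. index-vector selection.
# The Bool method is more specific, so masks dispatch there even though
# Bool <: Integer in Julia.
select(v::Vector, p::AbstractVector{Bool}) = v[findall(p)]     # logical mask
select(v::Vector, p::AbstractVector{<:Integer}) = v[p]         # index vector
```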
PGLearn._sort_h5! — Method
`_sort_h5!(D)`

Sort dataset `D` in increasing order of random seeds.

The dictionary `D` should be in h5-compatible format. It is modified in-place.

The function expects `D["meta"]["seed"]` to exist and be a `Vector{Int}`. An error is thrown if such an entry is not found.
PGLearn.bridges — Method
`bridges(data)`

Identify whether each branch is a bridge.

The input data must be in basic format.

A branch is a bridge if removing it renders the network disconnected. Returns a dictionary `res::Dict{String,Bool}` such that `res[br]` is `true` if branch `br` is a bridge, and `false` otherwise.
PGLearn.build_opf — Method
`build_opf(ACOPF, data, optimizer)`

Build an ACOPF model.
PGLearn.build_opf — Method
`build_opf(DCOPF, data, optimizer)`

Build a DCOPF model.
PGLearn.build_opf — Method
`build_opf(SDPOPF, data, optimizer)`

Build an SDPOPF model.
PGLearn.build_opf — Method
`build_opf(SparseSDPOPF, data, optimizer)`

Build a SparseSDPOPF model.
PGLearn.build_opf — Method
`build_opf(SOCOPF, data, optimizer)`

Build an SOCOPF model.
PGLearn.compute_flow! — Method
`compute_flow!(pf, pg, Φ::FullPTDF)`

Compute power flow `pf = Φ*pg` given PTDF matrix `Φ` and nodal injections `pg`.
PGLearn.compute_flow! — Method
`compute_flow!(pf, pg, Φ::LazyPTDF)`

Compute power flow `pf = Φ*pg` lazily, without forming the PTDF matrix.

Namely, `pf` is computed as `pf = BA * (F \ pg)`, where `F` is an LDLᵀ factorization of `AᵀBA`.
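A toy illustration of the lazy product on a hypothetical 3-bus, 3-branch network; a Cholesky factorization of the slack-reduced AᵀBA stands in here for the LDLᵀ factorization, and the matrix names follow the docstring:

```julia
using LinearAlgebra

# Toy 3-bus, 3-branch network (hypothetical data; bus 1 is the slack).
A  = [1 -1 0; 0 1 -1; 1 0 -1]   # branch-bus incidence matrix
b  = [10.0, 10.0, 10.0]         # branch susceptances
BA = Diagonal(b) * A

# Slack-reduced AᵀBA, factorized once and reused for every solve.
M  = (A' * BA)[2:end, 2:end]
F  = cholesky(Symmetric(M))

pg = [0.5, -0.5]                # injections at non-slack buses
pf = BA[:, 2:end] * (F \ pg)    # lazy PTDF product: Φ is never formed
```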
PGLearn.compute_voltage_phasor_bounds — Method
`compute_voltage_phasor_bounds(vfmin, vfmax, vtmin, vtmax, dvamin, dvamax)`

Compute lower/upper bounds on the `wr`/`wi` variables.
PGLearn.convert_float_data — Method
`convert_float_data(D, F)`

Convert all floating-point scalars and arrays to `F`.

Arguments
- `D`: should be a JSON-serializable dictionary, which generally means that all keys are `String` and all values are JSON-compatible.
- `F`: must be a subtype of `AbstractFloat`.

Returns
- `d::Dict{String,Any}`: a dictionary with the same nested structure as `D`, with all floating-point scalars and arrays converted to `F`.
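A recursive sketch of this conversion (helper name hypothetical; not the package implementation):

```julia
# Sketch: recursively convert floating-point scalars and arrays to F,
# leaving all other values (strings, integers, keys) untouched.
convert_floats(x::AbstractFloat, F) = F(x)
convert_floats(x::Array{<:AbstractFloat}, F) = F.(x)
convert_floats(d::Dict, F) = Dict(k => convert_floats(v, F) for (k, v) in d)
convert_floats(x, F) = x   # everything else passes through unchanged
```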
PGLearn.load_h5 — Function
`load_h5`

PGLearn.load_json — Method
`load_json(filename::AbstractString)`

Load JSON data from file `filename`.
PGLearn.ptdf_row — Method
`ptdf_row(Φ::FullPTDF, e::Int)`

Return the `e`-th row of PTDF matrix `Φ`.
PGLearn.ptdf_row — Method
`ptdf_row(Φ::LazyPTDF, e::Int)`

Return the `e`-th row of (lazy) PTDF matrix `Φ`.
PGLearn.save_h5 — Method
`save_h5(filename, D; warn=true)`

Saves dictionary `D` to HDF5 file `filename`.

Arguments
- `filename::AbstractString`: path to the HDF5 file; must be a valid path.
- `D`: dictionary to save to the file. All keys in `D` must be of `String` type, and it must be HDF5-compatible. Additional restrictions are enforced on the values of `D`, see below.
- `warn::Bool=true`: whether to raise a warning when converting numerical data.

Only the following types are supported:
- `String`
- (un)signed integers up to 64-bit precision
- `Float32` and `Float64`
- `Complex` versions of the above numeric types
- dense `Array`s of the above scalar types

Numerical data whose type is not listed above will be converted to `Float64`, which may incur a loss of precision. A warning will be displayed if this happens, unless `warn` is set to `false`. If conversion to `Float64` is not possible, an error is thrown.
PGLearn.save_json — Method
`save_json(filename::AbstractString, data; indent)`

Save `data` into JSON file `filename`. The following formats are supported:
- uncompressed JSON: `.json`
- Gzip-compressed JSON: `.json.gz`
- Bzip2-compressed JSON: `.json.bz2`
If the file extension does not match one of the above, an error is thrown.
PGLearn.tensorize — Method
`tensorize(V)`

Concatenate elements of `V` into a higher-dimensional tensor.

Similar to `Base.stack`, with one major difference: if `V` is a vector of scalars, the result is a 2D array `M` whose last dimension is `length(V)`, and such that `M[:, i] == V[i]`.

This function is only defined for `Vector{T}` and `Vector{Array{T,N}}` inputs, to avoid any unexpected behavior of `Base.stack`.
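The documented semantics can be sketched as follows (helper name hypothetical; `cat` stands in for the package's splatting-free implementation):

```julia
# Sketch: scalars stack into a 1×length(V) matrix; arrays stack along a
# new trailing dimension, as Base.stack would.
tensorize_sketch(V::Vector{<:Number}) = reshape(copy(V), 1, length(V))
tensorize_sketch(V::Vector{<:Array})  = cat(V...; dims=ndims(first(V)) + 1)
```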
Random.rand! — Method
`rand!(rng::AbstractRNG, s::AbstractOPFSampler, data::OPFData)`

Sample one new OPF instance and modify `data` in-place.

`data` must represent the same network (i.e., same grid components with same indexing) as the one used to create `s`.