PGLearn.E2ELRReserveScaler — Type

`E2ELRReserveScaler`

Samples reserve requirements following the procedure below:

- Sample a minimum reserve requirement `MRR` from a uniform distribution `U(lb, ub)` (`mrr_dist`).
- Compute the upper bound of reserve requirements for each generator as `rmax = α * (pmax - pmin)`.
- Fix the lower bound of reserve requirement of each generator to zero.
- Fix the reserve cost of each generator to zero.

The parameter `α` is a scaling factor that determines each generator's maximum reserve capacity. It is the `factor` parameter times the ratio of the largest generator's capacity to the sum of all generators' dispatchable capacity.

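The procedure above can be sketched in a few lines of plain Julia. The function below is only an illustration: the argument names (`pmin`, `pmax`, `factor`, `mrr_dist`) and the exact definition of dispatchable capacity are assumptions, not PGLearn internals.

```julia
using Random, Distributions

# Illustrative sketch of the sampling procedure described above (not PGLearn's implementation).
function sample_reserves(rng::AbstractRNG, pmin, pmax, factor, mrr_dist::Uniform)
    MRR   = rand(rng, mrr_dist)                        # minimum reserve requirement ~ U(lb, ub)
    α     = factor * maximum(pmax) / sum(pmax .- pmin) # scaling factor (assumed formula)
    rmax  = α .* (pmax .- pmin)                        # per-generator reserve upper bounds
    rmin  = zeros(length(pmax))                        # lower bounds fixed to zero
    rcost = zeros(length(pmax))                        # reserve costs fixed to zero
    return MRR, rmin, rmax, rcost
end

MRR, rmin, rmax, rcost = sample_reserves(MersenneTwister(1), [0.0, 0.1], [1.0, 2.0], 0.1, Uniform(0.2, 0.4))
```
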
PGLearn.Glocal — Type

`Glocal{G,L}`

A glocal distribution with global/local factors `α::G` and `η::L`.

This distribution represents a random variable of the form `ϵ = α×η`, where

- `α` is a scalar random variable, with distribution `d_α::G`,
- `η` is a vector random variable, with distribution `d_η::L`,
- `α` and `η` are independent random variables.

PGLearn.LoadScaler — Type

`LoadScaler{D}`

Scales loads with multiplicative noise sampled from `d::D`.

The distribution `d::D` is a `2L`-dimensional distribution. The sampled active (resp. reactive) demand for load $i$ is denoted by $\tilde{p}_{i}$ (resp. $\tilde{q}_{i}$) and has the form

\[\tilde{p}_{i} = \epsilon_{i} \bar{p}_{i}, \quad \tilde{q}_{i} = \epsilon_{i+L} \bar{q}_{i},\]

where $\bar{p}_{i}, \bar{q}_{i}$ are the reference active/reactive demand for load $i$, and $\epsilon \in \mathbb{R}^{2L}$ is multiplicative noise sampled from distribution `d::D`.

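The formula above can be illustrated in plain Julia as follows; the reference demands and the noise draw are toy stand-ins for an actual draw from `d`.

```julia
using Random

rng = Random.MersenneTwister(0)
pd_ref = [1.0, 0.5, 0.8]            # reference active demands p̄ (toy values)
qd_ref = [0.3, 0.1, 0.2]            # reference reactive demands q̄ (toy values)
L = length(pd_ref)

ϵ = 1.0 .+ 0.05 .* randn(rng, 2L)   # stand-in for one draw from the 2L-dimensional distribution d
pd = ϵ[1:L]    .* pd_ref            # p̃ᵢ = ϵᵢ p̄ᵢ
qd = ϵ[L+1:2L] .* qd_ref            # q̃ᵢ = ϵ_{i+L} q̄ᵢ
```
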
PGLearn.Nminus1StatusSampler — Type

`Nminus1StatusSampler`

Samples a branch or generator to be inactive following the procedure below:

- With probability 1/2, decide whether a branch or a generator is inactive in this instance.

If a branch is inactive:

- Identify the set of branches that are not bridges (removing a bridge results in a disconnected network).
- Sample a branch to be inactive from the set of non-bridge branches.
- Construct the status vector to be all ones, except for a zero at the sampled branch.

If a generator is inactive:

- Sample a generator to be inactive.
- Construct the status vector to be all ones, except for a zero at the sampled generator.

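A plain-Julia sketch of this procedure is given below. The bridge dictionary follows the format documented for `PGLearn.bridges` further down; the function itself and its variable names are purely illustrative.

```julia
using Random

# Illustrative sketch of the N-1 sampling procedure above (not PGLearn's implementation).
function sample_nminus1(rng::AbstractRNG, is_bridge::Dict{String,Bool}, ngen::Int)
    br_status  = Dict(br => 1 for br in keys(is_bridge))
    gen_status = ones(Int, ngen)
    if rand(rng) < 0.5
        # A branch is inactive: sample only among non-bridge branches.
        candidates = [br for (br, b) in is_bridge if !b]
        br_status[rand(rng, candidates)] = 0
    else
        # A generator is inactive.
        gen_status[rand(rng, 1:ngen)] = 0
    end
    return br_status, gen_status
end

br_status, gen_status = sample_nminus1(MersenneTwister(7), Dict("1" => false, "2" => true, "3" => false), 5)
```
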
PGLearn.OPFData — Method

`OPFData(network::Dict{String,Any})`

Convert a PowerModels data dictionary to an `OPFData` structure.

The PowerModels data dictionary must be in basic format.

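A usage sketch, assuming PowerModels is available; the case file path is a placeholder.

```julia
using PowerModels, PGLearn

# Parse a case and convert it to basic format before building OPFData.
# "pglib_opf_case14_ieee.m" is a placeholder path.
network = PowerModels.make_basic_network(PowerModels.parse_file("pglib_opf_case14_ieee.m"))
data = PGLearn.OPFData(network)
```
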
PGLearn.ScaledLogNormal — Method

`ScaledLogNormal(l, u, σs)`

Generate a `Glocal` distribution `ϵ = α×η` where `α ~ U[l, u]` and `ηᵢ ~ LogNormal(-σᵢ²/2, σᵢ)`.

PGLearn.ScaledUniform — Method

`ScaledUniform(l, u, σs)`

Generate a `Glocal` distribution `ϵ = α×η` where `α ~ U[l, u]` and `ηᵢ ~ U[1-σᵢ, 1+σᵢ]`.

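Example usage of the two constructors above; the bounds and noise levels are arbitrary values chosen for illustration.

```julia
using PGLearn

σs = fill(0.05, 14)                          # one local noise level per load (example)
d1 = PGLearn.ScaledLogNormal(0.8, 1.2, σs)   # α ~ U[0.8, 1.2], ηᵢ ~ LogNormal(-σᵢ²/2, σᵢ)
d2 = PGLearn.ScaledUniform(0.8, 1.2, σs)     # α ~ U[0.8, 1.2], ηᵢ ~ U[1-σᵢ, 1+σᵢ]
```
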
PGLearn._dedupe_and_sort_h5! — Method

`_dedupe_and_sort_h5!(D)`

De-duplicate and sort dataset `D` in increasing order of random seeds.

Equivalent to `_dedupe_h5!(D); _sort_h5!(D)`, but more efficient.

PGLearn._dedupe_h5! — Method

`_dedupe_h5!(D)`

De-duplicate points in h5 dataset `D`, according to their random seed.

PGLearn._get_case_info — Method

`_get_case_info(config)`

Extract case file and name from input `config`.

To be valid, the input config should include:

- either a `case_file` or a `pglib_case` entry;
- if no `case_file` is provided, `pglib_case` should be a valid, unique PGLib case name.

The case name will be set to the generic "case" value if none is provided.

If `case_file` is provided, `pglib_case` will be ignored. Therefore, users should provide `case_name` when supplying `case_file`.

PGLearn._merge_h5 — Method

`_merge_h5(V::Vector{Array{T,N}})`

Concatenate a collection of `N`-dimensional arrays along their last dimension.

This function is semantically equivalent to `cat(V...; dims=ndims(first(V)))`, but uses a more efficient, splatting-free implementation. All elements of `V` must have the same size in the first `N-1` dimensions.

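The documented equivalence can be checked directly in plain Julia:

```julia
V = [rand(3, 2, 4), rand(3, 2, 5), rand(3, 2, 1)]   # same size in the first N-1 dimensions
M = cat(V...; dims=ndims(first(V)))                 # what _merge_h5(V) computes, per the docstring
size(M)                                             # (3, 2, 10)
```
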
PGLearn._select_h5! — Method

`_select_h5!(D, p)`

Select data points in `D` as indicated by `p`.

`D` should be a dictionary in h5-compatible format, and `p` is either a vector of indices, or a logical vector of the same length as `D["meta"]["seed"]`.

- If `p` is a vector of indices, then all values of `p` should be integers between `1` and the number of elements in `D`.
- If `p` is a logical vector, then it should have the same length as `D["meta"]["seed"]`. Only datapoints `i` for which `p[i]` is `true` are selected.

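The two selection modes can be illustrated on toy per-datapoint arrays in plain Julia (this does not call PGLearn's internals):

```julia
seeds = [11, 7, 42, 3]                  # plays the role of D["meta"]["seed"], 4 datapoints
pg    = rand(5, 4)                      # one column per datapoint

p_idx  = [2, 4]                         # index-based selection
p_mask = [false, true, false, true]     # logical selection, same length as `seeds`

seeds[p_idx],  pg[:, p_idx]             # datapoints 2 and 4
seeds[p_mask], pg[:, p_mask]            # the same datapoints, selected via the mask
```
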
PGLearn._sort_h5! — Method

`_sort_h5!(D)`

Sort dataset `D` in increasing order of random seeds.

The dictionary `D` should be in h5-compatible format. It is modified in-place.

The function expects `D["meta"]["seed"]` to exist and be a `Vector{Int}`. An error is thrown if such an entry is not found.

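The sorting behavior amounts to a `sortperm` over the seed vector, applied consistently to every per-datapoint array; a plain-Julia illustration:

```julia
seeds = [42, 7, 19]                     # plays the role of D["meta"]["seed"]
obj   = [1.0, 2.0, 3.0]                 # some per-datapoint quantity

p = sortperm(seeds)                     # permutation that puts seeds in increasing order
seeds[p]                                # [7, 19, 42]
obj[p]                                  # reordered consistently: [2.0, 3.0, 1.0]
```
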
PGLearn.bridges — Method

`bridges(data)`

Identify whether each branch is a bridge.

The input data must be in basic format.

A branch is a bridge if removing it renders the network disconnected. Returns a dictionary `res::Dict{String,Bool}` such that `res[br]` is `true` if branch `br` is a bridge, and `false` otherwise.

PGLearn.build_opf — Method

`build_opf(ACOPF, data, optimizer)`

Build an ACOPF model.

PGLearn.build_opf — Method

`build_opf(DCOPF, data, optimizer)`

Build a DCOPF model.

PGLearn.build_opf — Method

`build_opf(SDPOPF, data, optimizer)`

Build an SDPOPF model.

PGLearn.build_opf — Method

`build_opf(SOCOPF, data, optimizer)`

Build an SOCOPF model.

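A hedged usage sketch for the ACOPF variant, with Ipopt as the optimizer. The signatures above do not state whether `data` is the PowerModels dictionary in basic format or an `OPFData` instance; the PowerModels dictionary is assumed here, and the case file path is a placeholder.

```julia
using PowerModels, PGLearn, Ipopt

# Build (without solving) an ACOPF model; "pglib_opf_case14_ieee.m" is a placeholder path.
network = PowerModels.make_basic_network(PowerModels.parse_file("pglib_opf_case14_ieee.m"))
opf = PGLearn.build_opf(PGLearn.ACOPF, network, Ipopt.Optimizer)
```
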
PGLearn.compute_flow! — Method

`compute_flow!(pf, pg, Φ::FullPTDF)`

Compute power flow `pf = Φ*pg` given PTDF matrix `Φ` and nodal injections `pg`.

PGLearn.compute_flow! — Method

`compute_flow!(pf, pg, Φ::LazyPTDF)`

Compute power flow `pf = Φ*pg` lazily, without forming the PTDF matrix.

Namely, `pf` is computed as `pf = BA * (F \ pg)`, where `F` is an LDLᵀ factorization of `AᵀBA`.

PGLearn.compute_voltage_phasor_bounds — Method

`compute_voltage_phasor_bounds(vfmin, vfmax, vtmin, vtmax, dvamin, dvamax)`

Compute lower/upper bounds on `wr`/`wi` variables.

PGLearn.convert_float_data — Method

`convert_float_data(D, F)`

Convert all floating-point scalars and arrays to `F`.

Arguments

- `D`: should be a JSON-serializable dictionary, which generally means that all keys are `String` and all values are JSON-compatible.
- `F`: must be a subtype of `AbstractFloat`.

Returns

- `d::Dict{String,Any}`: a dictionary with the same nested structure as `D`, with all floating-point scalars and arrays converted to `F`.

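Example usage, converting a toy nested dictionary to `Float32`:

```julia
using PGLearn

D = Dict("meta"   => Dict("ref" => "case14"),
         "primal" => Dict("pg" => [1.0, 2.5], "obj" => 123.4))
d32 = PGLearn.convert_float_data(D, Float32)
d32["primal"]["pg"]    # now a Vector{Float32}; String entries are left unchanged
```
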
PGLearn.load_h5 — Function

`load_h5`

PGLearn.load_json — Method

`load_json(filename::AbstractString)`

Load JSON data from file `filename`.

PGLearn.ptdf_row — Method

`ptdf_row(Φ::FullPTDF, e::Int)`

Return the `e`-th row of PTDF matrix `Φ`.

PGLearn.ptdf_row — Method

`ptdf_row(Φ::LazyPTDF, e::Int)`

Return the `e`-th row of (lazy) PTDF matrix `Φ`.

PGLearn.save_h5 — Method

`save_h5(filename, D; warn=true)`

Save dictionary `D` to HDF5 file `filename`.

Arguments

- `filename::AbstractString`: path to the HDF5 file; must be a valid path.
- `D`: dictionary to save to the file. All keys in `D` must be of `String` type, and it must be HDF5-compatible. Additional restrictions are enforced on the values of `D`, see below.
- `warn::Bool=true`: whether to raise a warning when converting numerical data.

Only the following types are supported:

- `String`
- (un)signed integers up to 64-bit precision
- `Float32` and `Float64`
- `Complex` versions of the above numeric types
- dense `Array`s of the above scalar types

Numerical data whose type is not listed above will be converted to `Float64`, which may incur a loss of precision. A warning will be displayed if this happens, unless `warn` is set to `false`. If conversion to `Float64` is not possible, an error is thrown.

PGLearn.save_json — Method

`save_json(filename::AbstractString, data; indent)`

Save `data` into JSON file `filename`. The following formats are supported:

- uncompressed JSON: `.json`
- Gzip-compressed JSON: `.json.gz`
- Bzip2-compressed JSON: `.json.bz2`

If the file extension does not match one of the above, an error is thrown.

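Example usage, assuming the `indent` keyword has a default value; the file names are placeholders.

```julia
using PGLearn

D = Dict("name" => "case14", "obj" => 123.4)
PGLearn.save_json("instance.json", D)        # uncompressed JSON
PGLearn.save_json("instance.json.gz", D)     # Gzip-compressed JSON
d = PGLearn.load_json("instance.json.gz")    # read it back into a dictionary
```
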
PGLearn.tensorize — Method

`tensorize(V)`

Concatenate elements of `V` into a higher-dimensional tensor.

Similar to `Base.stack`, with one major difference: if `V` is a vector of scalars, the result is a 2D array `M` whose last dimension is `length(V)`, and such that `M[:, i] == V[i]`.

This function is only defined for `Vector{T}` and `Vector{Array{T,N}}` inputs, to avoid any unexpected behavior of `Base.stack`.

Random.rand! — Method

`rand!(rng::AbstractRNG, s::AbstractOPFSampler, data::Dict)`

Sample one new OPF instance and modify `data` in-place.

`data` must be a `Dict` in PowerModels format, representing the same network (i.e., same grid components with same indexing) as the one used to create `s`.