forked from 3rdparty/wrf-python
20 changed files with 1883 additions and 32 deletions
@@ -0,0 +1,182 @@
|
||||
# Contributor Code of Conduct |
||||
|
||||
## Related Code of Conduct |
||||
|
||||
Participant Code of Conduct |
||||
[https://www2.fin.ucar.edu/ethics/participant-code-conduct](https://www2.fin.ucar.edu/ethics/participant-code-conduct) |
||||
|
||||
## Our Pledge |
||||
|
||||
We, as contributors and maintainers (participants), of WRF-Python pledge to |
||||
make participation in our software project and community a safe, productive, |
||||
welcoming and inclusive experience for everyone. All participants are required |
||||
to abide by this Code of Conduct. This includes respectful treatment of |
||||
everyone regardless of age, body size, disability, ethnicity, gender identity |
||||
or expression, level of experience, nationality, political affiliation, |
||||
veteran status, pregnancy, genetic information, physical appearance, race, |
||||
religion, or sexual orientation, as well as any other characteristic protected |
||||
under applicable US federal or state law. |
||||
|
||||
## Our Standards |
||||
|
||||
Examples of behaviors that contribute to a positive environment include: |
||||
|
||||
* Using welcoming and inclusive language |
||||
* Being respectful when offering and gracefully accepting constructive criticism |
||||
* Acknowledging the contributions of others |
||||
* Focusing on what is best for the community |
||||
* Showing empathy towards other community members |
||||
* Treating everyone with respect and consideration, valuing a diversity of |
||||
views and opinions |
||||
* Communicating openly with respect for others, critiquing ideas rather than |
||||
individuals |
||||
|
||||
Examples of unacceptable behavior include, but are not limited to: |
||||
|
||||
* Harassment, intimidation, or discrimination in any form |
||||
* Personal attacks directed toward other participants |
||||
* Unwelcome sexual attention or advances |
||||
* Inappropriate, negative, derogatory comments and/or attacks on personal |
||||
beliefs |
||||
* Publishing others' private information, such as a physical or electronic |
||||
address, without explicit permission |
||||
* Refusing to use the pronouns that someone requests |
||||
* Alarming, intimidating, threatening, or hostile comments or conduct |
||||
* Physical or verbal abuse by anyone to anyone, including but not limited to a |
||||
participant, member of the public, guest, member of any institution or |
||||
sponsor |
||||
* Comments related to characteristics given in the pledge at the top |
||||
* Inappropriate use of nudity and/or sexual images |
||||
* Threatening or stalking other participants |
||||
* Other conduct which could reasonably be considered inappropriate in a |
||||
professional setting |
||||
|
||||
## Scope |
||||
|
||||
This Code of Conduct applies to all spaces managed by the Project whether it |
||||
be online or face-to-face. This includes project code, code repository, |
||||
associated web pages, documentation, mailing lists, project websites and |
||||
wiki pages, issue tracker, meetings, telecons, events, project social media |
||||
accounts, and any other forums created by the project team which the community |
||||
uses for communication. In addition, violations of this Code of Conduct |
||||
outside these spaces may affect a person's ability to participate within them. |
||||
Representation of a project may be further defined and clarified by project |
||||
maintainers. |
||||
|
||||
## Community Responsibilities |
||||
|
||||
Everyone in the community is empowered to respond to people who are showing |
||||
unacceptable behavior. They can talk to them privately or publicly. Anyone |
||||
requested to stop unacceptable behavior is expected to comply immediately. |
||||
If the behavior continues, concerns may be brought to the project |
||||
administrators or to any other party listed in the Reporting section below. |
||||
|
||||
## Project Administrator Responsibilities |
||||
|
||||
Project Administrators are responsible for clarifying the standards of |
||||
acceptable behavior and are encouraged to model appropriate behavior and |
||||
provide support when people in the community point out inappropriate behavior. |
||||
Project administrator(s) are normally the ones that would be tasked to carry |
||||
out the actions in the Consequences section below. |
||||
|
||||
Project Administrators are also expected to keep this Code of Conduct updated |
||||
with the main one housed at UCAR as listed below in the Attribution section. |
||||
|
||||
## Reporting |
||||
|
||||
Instances of unacceptable behavior can be brought to the attention of the |
||||
project administrator(s) who may take any action as outlined in the |
||||
Consequences section below. However, making a report to a project |
||||
administrator is not considered an 'official report' to UCAR. |
||||
|
||||
Instances of unacceptable behavior may also be reported directly to UCAR via |
||||
UCAR's Harassment Reporting and Complaint Procedure at [https://www2.fin.ucar.edu/procedures/hr/harassment-reporting-and-complaint-procedure](https://www2.fin.ucar.edu/procedures/hr/harassment-reporting-and-complaint-procedure), |
||||
or anonymously through UCAR's EthicsPoint Hotline at [https://www2.fin.ucar.edu/ethics/anonymous-reporting](https://www2.fin.ucar.edu/ethics/anonymous-reporting). |
||||
|
||||
Complaints received by UCAR will be handled pursuant to the procedures |
||||
outlined in UCAR's Harassment Reporting and Complaint Procedure. Complaints |
||||
to UCAR will be held as confidential as practicable under the circumstances, |
||||
and retaliation against a person who initiates a complaint or an inquiry about |
||||
inappropriate behavior will not be tolerated. |
||||
|
||||
Any Contributor can use these reporting methods even if they are not directly |
||||
affiliated with UCAR. The Frequently Asked Questions (FAQ) page for reporting |
||||
is here: [https://www2.fin.ucar.edu/procedures/hr/reporting-faqs](https://www2.fin.ucar.edu/procedures/hr/reporting-faqs). |
||||
|
||||
## Consequences |
||||
|
||||
Upon receipt of a complaint, the project administrator(s) may take any action |
||||
deemed necessary and appropriate under the circumstances. Such action can |
||||
include things such as: removing, editing, or rejecting comments, commits, |
||||
code, wiki edits, email, issues, and other contributions that are not aligned |
||||
to this Code of Conduct, or banning temporarily or permanently any contributor |
||||
for other behaviors that are deemed inappropriate, threatening, offensive, or |
||||
harmful. Project Administrators also have the right to report violations to |
||||
UCAR HR and/or UCAR's Office of Diversity, Equity and Inclusion (ODEI) as |
||||
well as a participant's home institution and/or law enforcement. In the event |
||||
an incident is reported to UCAR, UCAR will follow its Harassment Reporting |
||||
and Complaint Procedure. |
||||
|
||||
## Process for Changes |
||||
|
||||
All UCAR managed projects are required to adopt this Contributor Code of |
||||
Conduct. Adoption is assumed even if not expressly stated in the repository. |
||||
Projects should fill in sections where prompted with project-specific |
||||
information, including, project name, email addresses, adoption date, etc. |
||||
There is one section below marked "optional" that may not apply to a given |
||||
project. |
||||
|
||||
Projects that adopt this Code of Conduct need to stay up to date with |
||||
UCAR's Contributor Code of Conduct, linked with a DOI in the "Attribution" |
||||
section below. Projects can make limited substantive changes to the Code of |
||||
Conduct, however, the changes must be limited in scope and may not contradict |
||||
the UCAR Contributor Code of Conduct. |
||||
|
||||
## Attribution |
||||
|
||||
This Code of Conduct was originally adapted from the Contributor Covenant, |
||||
version 1.4, available at [Contributor-Covenant](http://contributor-covenant.org/version/1/4). |
||||
We then aligned it with the UCAR Participant Code of Conduct, which also |
||||
borrows from the American Geophysical Union (AGU) Code of Conduct. The UCAR |
||||
Participant Code of Conduct applies to both UCAR employees as well as |
||||
participants in activities run by UCAR. We modified the "scope" section with |
||||
the django project description, and we added "Publication Ethics" from |
||||
the NGEET/FATES project. The original version of this for all software |
||||
projects that have strong management from UCAR or UCAR staff is available |
||||
on the UCAR website at [*Enter DOI link name*] (the date that it was adopted |
||||
by this project was [*Enter date adopted*]). When responding to complaints |
||||
UCAR HR and ODEI will do so based on the latest published version. Therefore, |
||||
any project-specific changes should follow the Process for Changes section |
||||
above. |
||||
|
||||
## Publication Ethics (optional) |
||||
|
||||
We aim to create an open development environment where developers can be |
||||
confident that all members of the community are publishing any research |
||||
on the project in an ethical manner. In particular, writing code is a form of |
||||
intellectual contribution, and one should expect that all such intellectual |
||||
contributions are respected and given credit in any resulting published work. |
||||
To support the community and avoid issues of misconduct related to the above |
||||
principle, please respect the following rules: |
||||
|
||||
* Document the version of the code used in any publication, preferably by |
||||
either using a release tag (existing or newly created) if possible, or a |
||||
commit hash if not. |
||||
|
||||
* Do not use code from anywhere other than the central project's development |
||||
repository main development branch without discussing with the author(s) of |
||||
the modified code your intentions for using the code and receiving their |
||||
permission to do so. |
||||
|
||||
* When using project features that have recently been integrated into the |
||||
central Project development repository, be mindful of the contributions |
||||
of others and, where the novel features qualitatively affect the results, |
||||
involve the author(s) of these features in any resulting manuscripts. |
||||
Be particularly aware of the concerns of early career researchers, and |
||||
ensure they have sufficient time to lead publications using their |
||||
developments. |
||||
|
||||
* When discussing results arising from older project features that have been |
||||
described in the literature or releases, accurately cite the publications |
||||
describing those features or releases. |
||||
|
@@ -0,0 +1,692 @@
|
||||
.. _internals: |
||||
|
||||
WRF-Python Internals |
||||
======================================== |
||||
|
||||
WRF-Python is a collection of diagnostic and interpolation routines for |
||||
WRF-ARW data. The API is kept to a minimal set of functions, since we've found |
||||
this to be the easiest to teach to new programmers, students, and scientists. |
||||
Future plans include adopting the Pangeo xarray/dask model, along with an |
||||
object oriented API, but this is not currently supported as of this user |
||||
guide. |
||||
|
||||
A typical use case for a WRF-Python user is to: |
||||
|
||||
1) Open a WRF data file (or sequence of files) using NetCDF4-python or PyNIO. |
||||
2) Compute a WRF diagnostic using :meth:`wrf.getvar`. |
||||
3) Perform any additional computations using methods outside of WRF-Python. |
||||
4) Create a plot of the output using matplotlib (basemap or cartopy) or |
||||
PyNGL. |
||||
|
||||
The purpose of this guide is to explain the internals of item (2) so that |
||||
users can help contribute or support the computational diagnostic |
||||
routines. |
||||
|
||||
|
||||
Overview of a :meth:`wrf.getvar` Diagnostic Computation |
||||
--------------------------------------------------------------- |
||||
|
||||
A diagnostic computed using the :meth:`wrf.getvar` function consists of the |
||||
following steps: |
||||
|
||||
1) Call the appropriate 'getter' function based on the specified diagnostic |
||||
label. This step occurs in the :meth:`wrf.getvar` routine in routines.py. |
||||
2) Extract the required variables from the NetCDF data file (or files). |
||||
3) Compute the diagnostic using a wrapped Fortran, C, or Python routine. |
||||
4) Convert to the desired units (if applicable). |
||||
5) Set the metadata (if desired) and return the result as an |
||||
:class:`xarray.DataArray`, or return a :class:`numpy.ndarray` if no |
||||
metadata is required. |
||||
|
||||
In the source directory, the :meth:`wrf.getvar` 'getter' routines have a |
||||
"\g\_" prefix for the naming convention (the "g" is for "get"). |
||||
|
||||
The unit conversion is handled by a :mod:`wrapt` decorator that can be found |
||||
in *decorators.py*. The setting of the metadata is handled using a :mod:`wrapt` |
||||
decorator, which can be found in the *metadecorators.py* file. |
||||
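For orientation, a :mod:`wrapt` decorator in its simplest form looks something
like the sketch below. This is a generic illustration (the names are made up
for this guide), not the actual wrf-python implementation in *decorators.py*
or *metadecorators.py*.

.. code:: python

    import wrapt

    def convert_units(unit_type):
        """A generic sketch of a wrapt-style decorator factory."""
        @wrapt.decorator
        def func_wrapper(wrapped, instance, args, kwargs):
            # Run the wrapped 'getter' routine first
            result = wrapped(*args, **kwargs)
            # ... unit conversion or metadata handling would happen here ...
            return result
        return func_wrapper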
|
||||
|
||||
Overview of Compiled Computational Routines |
||||
--------------------------------------------------------- |
||||
|
||||
Currently, the compiled computational routines are written in Fortran |
||||
90 and exposed to Python using f2py. The routines have been acquired over |
||||
decades, originating from NCL's Fortran77 codebase, the WRF model itself, |
||||
or other tools like RIP (Read Interpolate Plot), and do not necessarily |
||||
conform to a common programming mindset (e.g. 1D arrays, 2D arrays, etc.). |
||||
|
||||
The raw Fortran routines are compiled in to the :mod:`wrf._wrffortran` |
||||
extension module, but are not particularly useful for applications in their |
||||
raw form. These routines are imported in the *extension.py* module, where |
||||
additional functionality is added to make the routines more user friendly. |
||||
|
||||
The typical behavior for a fully exported Fortran routine in *extension.py* |
||||
is: |
||||
|
||||
1) Verify that the supplied arguments are valid in shape. Although f2py does |
||||
this as well, the errors thrown by f2py are confusing to users, so this |
||||
step helps provide better error messages. |
||||
|
||||
2) Allocate an output array based on the output shape of the algorithm, |
||||
number of "leftmost"[1]_ dimensions, and size of the data. |
||||
|
||||
3) Iterate over the leftmost [1]_ dimensions and compute output for argument |
||||
data slices that are of the same dimensionality as the compiled algorithm. |
||||
|
||||
4) Cast the argument arrays (or array slices) into the dtype used in the |
||||
compiled routine. For WRF data, the conversion is usually from a 4-byte |
||||
float to an 8-byte double. |
||||
|
||||
5) Extract the argument arrays out of xarray into numpy arrays |
||||
(if applicable) and transpose them into Fortran ordering. Note that this |
||||
does not actually do any copying of the data, it simply reorders the shape |
||||
tuple for the data and sets the Fortran ordering flag. This allows data |
||||
pointers from the output array slices to be passed directly to the |
||||
Fortran routine, which eliminates the need to copy the result to the output |
||||
array. |
||||
|
||||
The steps described above are handled in :mod:`wrapt` decorators that can be |
||||
found in *decorators.py*. For some routines that produce multiple outputs or |
||||
have atypical behavior, the special case decorators are located in |
||||
*specialdec.py*. |
||||
|
||||
.. [1] If the Fortran algorithm is written for a 2-dimensional array, |
||||
and a user passes in a 5-dimensional array, there are 3 "leftmost" |
||||
dimensions. |
||||
|
||||
|
||||
Example |
||||
---------------------------- |
||||
|
||||
The above overviews are better explained by an example. Although there are a |
||||
few exceptions (e.g. ll_to_xy), many of the routines in WRF-Python behave this |
||||
way. |
||||
|
||||
For this example, let's make a routine that adds a variable's base state |
||||
to its perturbation. This is the kind of thing that you'd normally use numpy |
||||
for (e.g. Ptot = PB + P), but you could do this if you wanted concurrency |
||||
for this operation via OpenMP rather than using dask (in a future release of |
||||
WRF-Python, both OpenMP and dask will be available). |
||||
|
||||
|
||||
Fortran Code |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
Below is the Fortran 90 code, which will be written to a file called |
||||
example.f90. |
||||
|
||||
.. code:: fortran |
||||
|
||||
SUBROUTINE pert_add(base, pert, total, nx, ny) |
||||
|
||||
!f2py threadsafe |
||||
!f2py intent(in,out) :: total |
||||
|
||||
INTEGER, INTENT(IN) :: nx, ny |
||||
REAL(KIND=8), INTENT(IN), DIMENSION(nx, ny) :: base, pert |
||||
REAL(KIND=8), INTENT(OUT), DIMENSION(nx, ny) :: total |
||||
|
||||
INTEGER :: i, j |
||||
|
||||
!$OMP PARALLEL DO COLLAPSE(2) SCHEDULE(runtime) |
||||
DO j=1, ny |
||||
DO i=1, nx |
||||
total(i, j) = base(i, j) + pert(i, j) |
||||
END DO |
||||
END DO |
||||
!$OMP END PARALLEL DO |
||||
|
||||
|
||||
END SUBROUTINE pert_add |
||||
|
||||
This code adds the 2D base and perturbation variables and stores the result in |
||||
a 2D output array. (For this example, we're using a 2D array to help |
||||
illustrate leftmost indexing below, but it could have been written using |
||||
a 1D or 3D array). |
||||
|
||||
At the top, there are these two f2py directives: |
||||
|
||||
.. code:: |
||||
|
||||
!f2py threadsafe |
||||
!f2py intent(in,out) :: total |
||||
|
||||
The *threadsafe* directive tells f2py to release Python's Global Interpreter |
||||
Lock (GIL) before calling the Fortran routine. The Fortran code does not |
||||
use any Python objects, so it is safe to release the GIL while the |
||||
computation runs. This way, Python threads will continue to run, which may be |
||||
important if you are using this in a web server or in some other |
||||
threaded environment like dask's threaded scheduler. |
||||
|
||||
The *intent(in,out)* f2py directive is used because we will |
||||
be supplying a slice of the output array directly to this routine and don't |
||||
want to have to copy the result from Fortran back into the result array. By |
||||
specifying intent(in,out), we're telling f2py to use the pointer to our |
||||
output array directly. |
||||
|
||||
Finally, for the OpenMP directive, the scheduler is set to use runtime |
||||
scheduling via *SCHEDULE(runtime)*. By using runtime scheduling, users |
||||
can set the scheduling type within Python, but for most users the default is |
||||
sufficient. |
||||
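As a sketch of what that looks like from the Python side, wrf-python exposes
thin wrappers around the OpenMP runtime library routines. The exact names
used below (:meth:`wrf.omp_set_num_threads`, :meth:`wrf.omp_set_schedule`, and
the ``OMP_SCHED_GUIDED`` constant) are assumptions based on the standard OpenMP
runtime API, so check the API reference for your installed version.

.. code:: python

    from wrf import omp_set_num_threads, omp_set_schedule, OMP_SCHED_GUIDED

    # Use 4 OpenMP threads and a guided schedule with a chunk size of 1.
    # This only takes effect because the Fortran loop uses SCHEDULE(runtime).
    omp_set_num_threads(4)
    omp_set_schedule(OMP_SCHED_GUIDED, 1)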
|
||||
|
||||
Building the Fortran Code |
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
||||
|
||||
To build the Fortran code, the *example.f90* source code should be placed in |
||||
the *fortran* directory of the source tree. |
||||
|
||||
Next, we need to update the numpy.distutils.core.Extension section of |
||||
*setup.py* in the root directory of the source tree. |
||||
|
||||
.. code:: python |
||||
|
||||
ext1 = numpy.distutils.core.Extension( |
||||
name="wrf._wrffortran", |
||||
sources=["fortran/wrf_constants.f90", |
||||
"fortran/wrf_testfunc.f90", |
||||
"fortran/wrf_user.f90", |
||||
"fortran/rip_cape.f90", |
||||
"fortran/wrf_cloud_fracf.f90", |
||||
"fortran/wrf_fctt.f90", |
||||
"fortran/wrf_user_dbz.f90", |
||||
"fortran/wrf_relhl.f90", |
||||
"fortran/calc_uh.f90", |
||||
"fortran/wrf_user_latlon_routines.f90", |
||||
"fortran/wrf_pvo.f90", |
||||
"fortran/eqthecalc.f90", |
||||
"fortran/wrf_rip_phys_routines.f90", |
||||
"fortran/wrf_pw.f90", |
||||
"fortran/wrf_vinterp.f90", |
||||
"fortran/wrf_wind.f90", |
||||
"fortran/omp.f90", |
||||
"fortran/example.f90 # New file added here |
||||
] |
||||
) |
||||
|
||||
The easiest way to build your code is to use one of the build scripts located |
||||
in the *build_scripts* directory of the source tree. These scripts contain |
||||
variants for compiling with or without OpenMP support. Unless you are |
||||
debugging a problem, building with OpenMP is recommended. |
||||
|
||||
For this example, we're going to assume you have already followed the |
||||
:ref:`dev_setup` instructions. Below are the build instructions for compiling |
||||
with OpenMP enabled using GCC (Linux or Mac): |
||||
|
||||
.. code:: |
||||
|
||||
pip uninstall wrf-python |
||||
cd build_scripts |
||||
sh ./gnu_omp.sh |
||||
|
||||
The above command will build and install the new routine, along with the |
||||
other Fortran routines. If you receive errors, then your code failed to |
||||
build successfully. Otherwise, your new routine can be called as |
||||
:meth:`wrf._wrffortran.pert_add`. |
||||
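A quick way to confirm that the build picked up the new routine is to import
it and print the calling signature that f2py generated (a minimal check,
assuming the install above succeeded):

.. code:: python

    from wrf._wrffortran import pert_add

    # f2py-generated functions describe their arguments in the docstring
    print(pert_add.__doc__)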
|
||||
|
||||
Creating a Thin Python Wrapper |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
The new Fortran pert_add routine will work well for a 2D slice of data. |
||||
However, if you want to extend the functionality |
||||
to work with arrays of any dimensionality, you'll need to add a thin wrapper |
||||
with some extra functionality that makes use of :mod:`wrapt` decorators. |
||||
|
||||
First, let's start by creating a very thin wrapper in Python in *extension.py*. |
||||
|
||||
.. code:: python |
||||
|
||||
import numpy as np |
||||
from wrf._wrffortran import pert_add |
||||
|
||||
. |
||||
. |
||||
. |
||||
|
||||
def _pert_add(base, pert, outview=None): |
||||
"""Wrapper for pert_add. |
||||
|
||||
Located in example.f90. |
||||
|
||||
""" |
||||
if outview is None: |
||||
outview = np.empty(base.shape[0:2], base.dtype, order="F") |
||||
|
||||
result = pert_add(base, |
||||
pert, |
||||
outview) |
||||
|
||||
return result |
||||
|
||||
Despite being only a few lines of code, there is quite a bit going on in the |
||||
wrapper. The first thing to note is the arguments to the wrapper function. The |
||||
only arguments that we need for the wrapper are the inputs to the function |
||||
and an "outview" keyword argument. At this point in the call chain, the |
||||
arguments are assumed to be Fortran-ordered, in that the Fortran ordering flag |
||||
is set and the shape is transposed from a usual C-ordered numpy array |
||||
(the data itself remains in the same order that it was created). By passing |
||||
numpy arrays with the Fortran order flag set, f2py will pass the pointer |
||||
directly through to the Fortran routine. |
||||
|
||||
The *outview* keyword argument is used during leftmost dimension indexing to |
||||
send slices of the output array to the Fortran routine to be filled. If there |
||||
are no leftmost dimensions (e.g. this routine is called with 2D data), then the |
||||
outview argument will be None and an outview variable will be created with the |
||||
same number of dimensions as the *base* argument. It should be created with |
||||
Fortran ordering so that the pointer is directly passed to the Fortran routine. |
||||
|
||||
When the actual :meth:`wrf._wrffortran.pert_add` Fortran routine is called, |
||||
the nx and ny arguments are omitted because f2py will supply them for us |
||||
based on the shape of the numpy arrays we are supplying as input arguments. |
||||
F2py also likes to return an array as a result, so even though we supplied |
||||
outview as an array to be filled by the Fortran routine, we will still get a |
||||
result from the function call that points to the same memory as outview. |
||||
(We could have chosen to ignore the result and return outview instead). |
||||
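For illustration, a direct call to the undecorated thin wrapper with small
2D Fortran-ordered arrays might look like this (a hypothetical snippet; in
practice the decorators described in the next sections prepare the arrays
for you):

.. code:: python

    import numpy as np
    from wrf.extension import _pert_add

    nx, ny = 3, 4
    base = np.asfortranarray(np.full((nx, ny), 100000.0))
    pert = np.asfortranarray(np.full((nx, ny), 250.0))

    total = _pert_add(base, pert)

    print(total.shape)    # (3, 4)
    print(total[0, 0])    # 100250.0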
|
||||
|
||||
Extract and Transpose |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
The arrays that are being passed to the _pert_add thin wrapper need to be |
||||
numpy arrays in Fortran ordering, but they won't come this way from |
||||
users. They will come in as either :class:`numpy.ndarray` |
||||
or :class:`xarray.DataArray` and will be C-ordered. So, we need to make |
||||
sure that a Fortran-ordered :class:`numpy.ndarray` is what is passed to |
||||
the thin wrapper. |
||||
|
||||
Since this type of operation is repeated for many diagnostic functions, a |
||||
decorator has been written in *decorators.py* for this purpose. Let's decorate |
||||
our thin wrapper with this function. |
||||
|
||||
|
||||
.. code:: python |
||||
|
||||
@extract_and_transpose() |
||||
def _pert_add(base, pert, outview=None): |
||||
"""Wrapper for pert_add. |
||||
|
||||
Located in example.f90. |
||||
|
||||
""" |
||||
if outview is None: |
||||
outview = np.empty(base.shape[0:2], base.dtype, order="F") |
||||
|
||||
result = pert_add(base, |
||||
pert, |
||||
outview) |
||||
|
||||
return result |
||||
|
||||
|
||||
The :meth:`extract_and_transpose` decorator converts any arguments to _pert_add |
||||
that are of type :class:`xarray.DataArray` to :class:`numpy.ndarray`, then |
||||
takes the :attr:`numpy.ndarray.T` attribute, and passes the result on to the |
||||
_pert_add wrapper. |
||||
|
||||
Following the computation, we want the result to be returned back as the |
||||
same C-ordered array types that went in as arguments, so this decorator takes |
||||
the result of the computation and returns the :attr:`numpy.ndarray.T` from the |
||||
Fortran-ordered result. This result gets passed back up the decorator chain. |
||||
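The no-copy claim is easy to verify: :attr:`numpy.ndarray.T` only swaps the
shape and strides of the array, so the transposed view shares memory with the
original (an illustrative snippet):

.. code:: python

    import numpy as np

    a = np.ones((3, 4), dtype=np.float64)   # C-ordered array
    at = a.T                                # Fortran-ordered view, shape (4, 3)

    print(at.flags["F_CONTIGUOUS"])   # True
    print(np.shares_memory(a, at))    # True -- no data was copied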
|
||||
|
||||
Cast to Fortran Array Types |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
The Fortran routine expects a specific data type for the arrays (usually |
||||
REAL(KIND=8)). WRF files typically store their data as 4-byte floating point |
||||
numbers to save space. The arrays being passed to the |
||||
:meth:`wrf.decorators.extract_and_transpose` decorator need to be converted |
||||
to the type used in the Fortran routine (e.g. double), then converted back to |
||||
the original type (e.g. float) after the computation is finished. This is |
||||
handled by the :meth:`wrf.decorators.cast_type` decorator function in |
||||
*decorators.py*. |
||||
|
||||
.. code:: python |
||||
|
||||
@cast_type(arg_idxs=(0, 1)) |
||||
@extract_and_transpose() |
||||
def _pert_add(base, pert, outview=None): |
||||
"""Wrapper for pert_add. |
||||
|
||||
Located in example.f90. |
||||
|
||||
""" |
||||
if outview is None: |
||||
outview = np.empty(base.shape[0:2], base.dtype, order="F") |
||||
|
||||
result = pert_add(base, |
||||
pert, |
||||
outview) |
||||
|
||||
return result |
||||
|
||||
The :meth:`wrf.decorators.cast_type` decorator function takes an |
||||
*arg_idxs* argument to specify which positional arguments need to be cast to |
||||
the Fortran algorithm type, in this case arguments 0 and 1 (base and pert). |
||||
|
||||
Following the computation, the result will be cast back to the original type |
||||
for the input arguments (usually float), and passed back up the decorator |
||||
chain. |
||||
|
||||
|
||||
Leftmost Dimension Indexing |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
The WRF-Python algorithms written in Fortran are usually written for fixed |
||||
size arrays of 1, 2, or 3 dimensions. If your input arrays have more than |
||||
the number of dimensions specified for the Fortran algorithm, then we need to |
||||
do the following: |
||||
|
||||
1. Determine how many leftmost dimensions are used. |
||||
|
||||
2. Create an output array that has a shape that contains the leftmost |
||||
dimensions concatenated with the shape of the result from the Fortran |
||||
algorithm. |
||||
|
||||
3. Iterate over the leftmost dimensions and send slices of the input arrays |
||||
to the Fortran algorithm. |
||||
|
||||
4. Along with the input arrays above, send a slice of the output array to be |
||||
filled by the Fortran algorithm. |
||||
|
||||
5. Return the fully calculated output array. |
||||
|
||||
The :meth:`wrf.decorators.left_iteration` function is a general purpose decorator |
||||
contained in *decorators.py* that handles most leftmost index iteration cases. |
||||
(Note: Some products, like cape_2d, return multiple products in the output |
||||
and don't fall into this generic category, so those decorators can be found |
||||
in *specialdec.py*.) |
||||
|
||||
Let's look at how this is used below. |
||||
|
||||
.. code:: python |
||||
|
||||
@left_iteration(2, 2, ref_var_idx=0) |
||||
@cast_type(arg_idxs=(0, 1)) |
||||
@extract_and_transpose() |
||||
def _pert_add(base, pert, outview=None): |
||||
"""Wrapper for pert_add. |
||||
|
||||
Located in example.f90. |
||||
|
||||
""" |
||||
if outview is None: |
||||
outview = np.empty(base.shape[0:2], base.dtype, order="F") |
||||
|
||||
result = pert_add(base, |
||||
pert, |
||||
outview) |
||||
|
||||
return result |
||||
|
||||
|
||||
The :meth:`wrf.decorators.left_iteration` decorator handles many different |
||||
use cases with its arguments, but this example is one of the more common cases. |
||||
The 0th positional argument tells the decorator that the "reference" input |
||||
variable should provide at least two dimensions. This should be set to |
||||
the same number of dimensions as in the Fortran algorithm, which is two in this |
||||
case. Dimensions to the left of these two dimensions are considered "leftmost" |
||||
dimensions. |
||||
|
||||
The next positional argument (value of 2) tells the decorator that the |
||||
newly created output variable should retain the shape of the reference |
||||
variable's rightmost two dimensions. This only applies when your output has fewer |
||||
dimensions than the reference variable (e.g. sea level pressure uses |
||||
geopotential height for the reference but produces 2D output). Since we are |
||||
not reducing the output dimensions, it should be set to the same value as the |
||||
previous argument. |
||||
|
||||
The final keyword argument, *ref_var_idx*, tells the decorator to use |
||||
positional argument 0 (for the _pert_add function) as the reference |
||||
variable. |
||||
|
||||
The result of this decorator will be the fully computed output array, which |
||||
gets passed back up the chain. |
||||
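To make the leftmost iteration concrete, suppose the fully decorated
_pert_add receives WRF-sized arrays with the usual
(Time, bottom_top, south_north, west_east) shape. Conceptually, the decorator
does something like the loop below (hypothetical sizes; the real decorator is
more general and calls the compiled routine instead of numpy):

.. code:: python

    import numpy as np

    # (Time, bottom_top, south_north, west_east)
    base = np.zeros((4, 30, 96, 96), dtype=np.float64)
    pert = np.zeros_like(base)

    # The Fortran routine works on 2D (south_north, west_east) slices, so the
    # two leftmost dimensions are (Time, bottom_top): 4 * 30 = 120 slices.
    output = np.empty(base.shape, base.dtype)
    for t in range(base.shape[0]):
        for k in range(base.shape[1]):
            output[t, k] = base[t, k] + pert[t, k]  # stand-in for the Fortran call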
|
||||
|
||||
Checking Argument Shapes |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
Before any computations can be performed, the argument shapes are checked to |
||||
verify their sizes. Although f2py will catch problems at the |
||||
entry point to the Fortran routine, the error thrown is confusing to |
||||
users. |
||||
|
||||
The :meth:`wrf.decorators.check_args` decorator is used to verify that the |
||||
arguments are the correct size before proceeding. |
||||
|
||||
Here is how it is used: |
||||
|
||||
|
||||
.. code:: python |
||||
|
||||
@check_args(0, 2, (2, 2)) |
||||
@left_iteration(2, 2, ref_var_idx=0) |
||||
@cast_type(arg_idxs=(0, 1)) |
||||
@extract_and_transpose() |
||||
def _pert_add(base, pert, outview=None): |
||||
"""Wrapper for pert_add. |
||||
|
||||
Located in example.f90. |
||||
|
||||
""" |
||||
if outview is None: |
||||
outview = np.empty(base.shape[0:2], base.dtype, order="F") |
||||
|
||||
result = pert_add(base, |
||||
pert, |
||||
outview) |
||||
|
||||
return result |
||||
|
||||
The 0th positional argument (value of 0) tells |
||||
:meth:`wrf.decorators.check_args` that the 0th positional argument of |
||||
_pert_add is the reference variable. |
||||
|
||||
The next positional argument (value of 2) tells |
||||
:meth:`wrf.decorators.check_args` that it should expect at least 2 dimensions |
||||
for the reference variable. This should be set to the number of dimensions |
||||
used in the Fortran algorithm, which is two in this case. |
||||
|
||||
The final positional argument is a tuple with the number of dimensions that |
||||
are expected for each array argument. Again, this should be set to the same |
||||
number of dimensions expected in the Fortran routine for each positional |
||||
argument. If an argument to your wrapped function is not an array type, you |
||||
can use None in the tuple to ignore it, but that is not applicable for this |
||||
example. |
||||
|
||||
|
||||
Putting It All Together |
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
||||
|
||||
The previous sections showed how the decorator chain was built up from the |
||||
_pert_add function. However, when you actually make a call to _pert_add, the |
||||
decorators are called from top to bottom. This means check_args is called |
||||
first, then left_iteration, then cast_type, then extract_and_transpose, |
||||
and finally _pert_add. After _pert_add is finished, the result is passed |
||||
back up the chain and back to the user. |
||||
|
||||
Now that we have a fully wrapped compiled routine, how might we use this? |
||||
|
||||
Let's make a new :meth:`wrf.getvar` diagnostic called 'total_pressure'. A |
||||
similar diagnostic already exists in WRF-Python, but this is just for |
||||
illustration of how to use our newly wrapped Fortran routine. |
||||
|
||||
|
||||
Make a 'getter' Function |
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
||||
|
||||
First, we need a 'getter' routine that extracts the required input variables |
||||
from the WRF NetCDF file(s) to perform the computation. In this case, the |
||||
variables are P and PB. |
||||
|
||||
The current naming convention in WRF-Python is to prefix the 'getter' |
||||
functions with a '\g\_', so let's call this file g_totalpres.py and make a |
||||
function get_total_pressure inside of it. |
||||
|
||||
The contents of this file will be: |
||||
|
||||
.. code:: python |
||||
|
||||
# g_totalpres.py |
||||
|
||||
from .extension import _pert_add |
||||
from .metadecorators import copy_and_set_metadata |
||||
from .util import extract_vars |
||||
|
||||
@copy_and_set_metadata(copy_varname="P", name="total_pressure", |
||||
description="total pressure", |
||||
units="Pa") |
||||
def get_total_pressure(wrfin, timeidx=0, method="cat", squeeze=True, |
||||
cache=None, meta=True, _key=None): |
||||
"""Return total pressure. |
||||
|
||||
This function extracts the necessary variables from the NetCDF file |
||||
object in order to perform the calculation. |
||||
|
||||
Args: |
||||
|
||||
wrfin (:class:`netCDF4.Dataset`, :class:`Nio.NioFile`, or an \ |
||||
iterable): WRF-ARW NetCDF |
||||
data as a :class:`netCDF4.Dataset`, :class:`Nio.NioFile` |
||||
or an iterable sequence of the aforementioned types. |
||||
|
||||
timeidx (:obj:`int` or :data:`wrf.ALL_TIMES`, optional): The |
||||
desired time index. This value can be a positive integer, |
||||
negative integer, or |
||||
:data:`wrf.ALL_TIMES` (an alias for None) to return |
||||
all times in the file or sequence. The default is 0. |
||||
|
||||
method (:obj:`str`, optional): The aggregation method to use for |
||||
sequences. Must be either 'cat' or 'join'. |
||||
'cat' combines the data along the Time dimension. |
||||
'join' creates a new dimension for the file index. |
||||
The default is 'cat'. |
||||
|
||||
squeeze (:obj:`bool`, optional): Set to False to prevent dimensions |
||||
with a size of 1 from being automatically removed from the |
||||
shape of the output. Default is True. |
||||
|
||||
cache (:obj:`dict`, optional): A dictionary of (varname, ndarray) |
||||
that can be used to supply pre-extracted NetCDF variables to |
||||
the computational routines. It is primarily used for internal |
||||
purposes, but can also be used to improve performance by |
||||
eliminating the need to repeatedly extract the same variables |
||||
used in multiple diagnostics calculations, particularly when |
||||
using large sequences of files. |
||||
Default is None. |
||||
|
||||
meta (:obj:`bool`, optional): Set to False to disable metadata and |
||||
return :class:`numpy.ndarray` instead of |
||||
:class:`xarray.DataArray`. Default is True. |
||||
|
||||
_key (:obj:`int`, optional): A caching key. This is used for |
||||
internal purposes only. Default is None. |
||||
|
||||
Returns: |
||||
:class:`xarray.DataArray` or :class:`numpy.ndarray`: Total pressure. |
||||
If xarray is |
||||
enabled and the *meta* parameter is True, then the result will be a |
||||
:class:`xarray.DataArray` object. Otherwise, the result will be a |
||||
:class:`numpy.ndarray` object with no metadata. |
||||
|
||||
""" |
||||
|
||||
# Get the base and perturbation pressures |
||||
varnames = ("PB", "P") |
||||
ncvars = extract_vars(wrfin, timeidx, varnames, method, squeeze, cache, |
||||
meta=False, _key=_key) |
||||
|
||||
pb = ncvars["PB"] |
||||
p = ncvars["P"] |
||||
|
||||
total_pres = _pert_add(pb, p) |
||||
|
||||
return total_pres |
||||
|
||||
|
||||
This getter function extracts the PB and P (base and perturbation pressure) |
||||
variables and calls the _pert_add function and returns the result. The |
||||
arguments *wrfin*, *timeidx*, *method*, *squeeze*, *cache*, *meta*, and |
||||
*_key* are used for every getter function and you can read what they do in |
||||
the docstring. |
||||
|
||||
The getter function is also decorated with a |
||||
:meth:`wrf.metadecorators.copy_and_set_metadata` decorator. This is a general |
||||
purpose decorator that is used for copying metadata from an input variable |
||||
to the result. In this case, the variable to copy is 'P'. The *name* parameter |
||||
specifies the :attr:`xarray.DataArray.name` attribute for the variable |
||||
(the name that will be written to a NetCDF variable). The *description* is a |
||||
brief description of the variable that will be placed in the |
||||
:attr:`xarray.DataArray.attrs` dictionary along with the *units* parameter. |
||||
|
||||
|
||||
Make Your New Diagnostic Available in :meth:`wrf.getvar` |
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
||||
|
||||
The final step is to make the new 'total_pressure' diagnostic available from |
||||
:meth:`wrf.getvar`. To do this, modifications need to be made to |
||||
*routines.py*. |
||||
|
||||
First, import your new getter routine at the top of routines.py. |
||||
|
||||
.. code:: python |
||||
|
||||
from __future__ import (absolute_import, division, print_function) |
||||
|
||||
from .util import (get_iterable, is_standard_wrf_var, extract_vars, |
||||
viewkeys, get_id) |
||||
from .g_cape import (get_2dcape, get_3dcape, get_cape2d_only, |
||||
get_cin2d_only, get_lcl, get_lfc, get_3dcape_only, |
||||
get_3dcin_only) |
||||
. |
||||
. |
||||
. |
||||
from .g_cloudfrac import (get_cloudfrac, get_low_cloudfrac, |
||||
get_mid_cloudfrac, get_high_cloudfrac) |
||||
from .g_totalpres import get_total_pressure |
||||
|
||||
|
||||
Next, update _FUNC_MAP to map your diagnostic label ('total_pressure') |
||||
to the getter routine (get_total_pressure). |
||||
|
||||
.. code:: python |
||||
|
||||
_FUNC_MAP = {"cape2d": get_2dcape, |
||||
"cape3d": get_3dcape, |
||||
. |
||||
. |
||||
. |
||||
"high_cloudfrac": get_high_cloudfrac, |
||||
"total_pressure": get_total_pressure |
||||
} |
||||
|
||||
|
||||
Finally, update _VALID_KARGS to inform :meth:`wrf.getvar` of any additional |
||||
keyword argument names that this routine might use. The :meth:`wrf.getvar` |
||||
routine will check keyword arguments and throw an error when it gets any that |
||||
are not declared in this map. |
||||
|
||||
In this case, there aren't any additional keyword arguments, so we'll just |
||||
supply an empty list. |
||||
|
||||
.. code:: python |
||||
|
||||
_VALID_KARGS = {"cape2d": ["missing"], |
||||
"cape3d": ["missing"], |
||||
"dbz": ["do_variant", "do_liqskin"], |
||||
"maxdbz": ["do_variant", "do_liqskin"], |
||||
. |
||||
. |
||||
. |
||||
"high_cloudfrac": ["vert_type", "low_thresh", |
||||
"mid_thresh", "high_thresh"], |
||||
"total_pressure": [] |
||||
} |
||||
|
||||
After this is complete, your new routine is now available for use from |
||||
:meth:`wrf.getvar`. |
||||
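At this point the new diagnostic can be used like any other (a usage sketch;
the file name below is hypothetical):

.. code:: python

    from netCDF4 import Dataset
    from wrf import getvar

    ncfile = Dataset("wrfout_d01_2005-08-28_00:00:00")  # hypothetical WRF output file
    total_p = getvar(ncfile, "total_pressure")
    print(total_p)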
|
||||
|
||||
|
||||
|
||||
|
@@ -0,0 +1,3 @@
|
||||
If you only wish to submit a Fortran contribution, without supplying any |
||||
wrappers or other Python code, please submit your code to this |
||||
directory. |
@@ -0,0 +1,20 @@
|
||||
# Create full conda environment for development, including some useful tools |
||||
name: develop |
||||
channels: |
||||
- conda-forge |
||||
dependencies: |
||||
- python=3 |
||||
- wrapt |
||||
- numpy |
||||
- matplotlib |
||||
- netcdf4 |
||||
- xarray |
||||
- jupyter |
||||
- sphinx |
||||
- sphinx_rtd_theme |
||||
- pycodestyle |
||||
- cartopy |
||||
- basemap |
||||
- gcc_linux-64 |
||||
- gfortran_linux-64 |
||||
|
@@ -0,0 +1,20 @@
|
||||
# Create full conda environment for development, including some useful tools |
||||
name: develop |
||||
channels: |
||||
- conda-forge |
||||
dependencies: |
||||
- python=3 |
||||
- wrapt |
||||
- numpy |
||||
- matplotlib |
||||
- netcdf4 |
||||
- xarray |
||||
- jupyter |
||||
- sphinx |
||||
- sphinx_rtd_theme |
||||
- pycodestyle |
||||
- cartopy |
||||
- basemap |
||||
- clang_osx-64 |
||||
- gfortran_osx-64 |
||||
|
@@ -0,0 +1 @@
|
||||
This directory is for user contributed tests. |
@@ -0,0 +1,141 @@
|
||||
{ |
||||
"cells": [ |
||||
{ |
||||
"cell_type": "code", |
||||
"execution_count": 24, |
||||
"metadata": {}, |
||||
"outputs": [ |
||||
{ |
||||
"name": "stdout", |
||||
"output_type": "stream", |
||||
"text": [ |
||||
"<xarray.DataArray 'T2' (Time: 4, bottom_top: 1, south_north: 96, west_east: 96)>\n", |
||||
"array([[[[303.6255 , ..., 303.81546],\n", |
||||
" ...,\n", |
||||
" [304.7401 , ..., 300.15717]]],\n", |
||||
"\n", |
||||
"\n", |
||||
" ...,\n", |
||||
"\n", |
||||
"\n", |
||||
" [[[301.89346, ..., 302.56534],\n", |
||||
" ...,\n", |
||||
" [302.61246, ..., 298.01028]]]], dtype=float32)\n", |
||||
"Coordinates:\n", |
||||
" * Time (Time) datetime64[ns] 2005-08-28 2005-08-28T03:00:00 ...\n", |
||||
" XTIME (Time) float64 0.0 180.0 360.0 540.0\n", |
||||
" XLAT (south_north, west_east) float32 21.302004 21.302004 21.302004 ...\n", |
||||
" XLONG (south_north, west_east) float32 -90.57406 -90.484116 ...\n", |
||||
"Dimensions without coordinates: bottom_top, south_north, west_east\n", |
||||
"Attributes:\n", |
||||
" FieldType: 104\n", |
||||
" MemoryOrder: XY \n", |
||||
" description: TEMP at 2 M\n", |
||||
" units: K\n", |
||||
" stagger: \n", |
||||
" coordinates: XLONG XLAT XTIME\n", |
||||
" projection: Mercator(stand_lon=-89.0, moad_cen_lat=27.99999237060547, t...\n" |
||||
] |
||||
} |
||||
], |
||||
"source": [ |
||||
"from __future__ import print_function, division\n", |
||||
"\n", |
||||
"import os\n", |
||||
"import numpy as np\n", |
||||
"from netCDF4 import Dataset\n", |
||||
"from wrf import getvar, ALL_TIMES, to_np\n", |
||||
"import xarray\n", |
||||
"\n", |
||||
"path = '/scratch/mawagner/2015_GRELL3D/TESTFILES/TestTime'\n", |
||||
"filename_list = os.listdir(path)\n", |
||||
"filename_list.sort()\n", |
||||
"\n", |
||||
"#filename_list = [\"/Users/ladwig/Documents/wrf_files/wrf_vortex_single/wrfout_d02_2005-08-28_00:00:00\",\n", |
||||
"# \"/Users/ladwig/Documents/wrf_files/wrf_vortex_single/wrfout_d02_2005-08-28_03:00:00\",\n", |
||||
"# \"/Users/ladwig/Documents/wrf_files/wrf_vortex_single/wrfout_d02_2005-08-28_06:00:00\",\n", |
||||
"# \"/Users/ladwig/Documents/wrf_files/wrf_vortex_single/wrfout_d02_2005-08-28_09:00:00\"]\n", |
||||
"\n", |
||||
"# Result shape \n", |
||||
"result_shape = (6, 1, 290, 265)\n", |
||||
"#result_shape = (4, 1, 96, 96)\n", |
||||
"\n", |
||||
"# Let's get the first time so we can copy the metadata later\n", |
||||
"f = Dataset(filename_list[0])\n", |
||||
"# By setting squeeze to False, you'll get all the dimension names.\n", |
||||
"z1 = getvar(f, \"T2\", 0, squeeze=False)\n", |
||||
"xlat = getvar(f, \"XLAT\", 0)\n", |
||||
"xlong = getvar(f, \"XLONG\", 0)\n", |
||||
"\n", |
||||
"\n", |
||||
"z_final = np.empty(result_shape, np.float32)\n", |
||||
"\n", |
||||
"# Modify this number if using more than 1 time per file\n", |
||||
"times_per_file = 1\n", |
||||
"#times_per_file = 4\n", |
||||
"\n", |
||||
"data_times = []\n", |
||||
"xtimes = []\n", |
||||
"for timeidx in range(result_shape[0]):\n", |
||||
" # Compute the file index and the time index inside the file\n", |
||||
" fileidx = timeidx // times_per_file\n", |
||||
" file_timeidx = timeidx % times_per_file\n", |
||||
"\n", |
||||
" f = Dataset(filename_list[fileidx])\n", |
||||
" z = getvar(f, \"T2\", file_timeidx)\n", |
||||
" t = getvar(f, \"Times\", file_timeidx)\n", |
||||
" xt = getvar(f, \"xtimes\", file_timeidx)\n", |
||||
" data_times.append(to_np(t))\n", |
||||
" xtimes.append(to_np(xt))\n", |
||||
" z_final[timeidx,:] = z[:]\n", |
||||
" f.close()\n", |
||||
" \n", |
||||
"# Let's make the metadata. Dimension names should copy easily if you set sqeeze to False, \n", |
||||
"# otherwise you can just set them yourself is a tuple of dimension names. Since you wanted\n", |
||||
"# to keep the bottom_top dimension for this 2D variable (which is normally removed), \n", |
||||
"# I'm doing this manually.\n", |
||||
"z_dims = [\"Time\", \"bottom_top\", \"south_north\", \"west_east\"]\n", |
||||
"\n", |
||||
"# Xarray doesn't copy coordinates easily (it always complains about shape mismatches), so do this\n", |
||||
"# manually\n", |
||||
"z_coords = {}\n", |
||||
"z_coords[\"Time\"] = data_times\n", |
||||
"z_coords[\"XTIME\"] = (\"Time\",), xtimes\n", |
||||
"z_coords[\"XLAT\"] = (\"south_north\", \"west_east\"), xlat\n", |
||||
"z_coords[\"XLONG\"] = (\"south_north\", \"west_east\"), xlong\n", |
||||
"z_name = \"T2\"\n", |
||||
"\n", |
||||
"# Attributes copy nicely\n", |
||||
"z_attrs = {}\n", |
||||
"z_attrs.update(z1.attrs)\n", |
||||
"\n", |
||||
"z_with_meta = xarray.DataArray(z_final, coords=z_coords, dims=z_dims, attrs=z_attrs, name=z_name)\n", |
||||
"\n", |
||||
"print(z_with_meta)\n", |
||||
" \n", |
||||
" " |
||||
] |
||||
} |
||||
], |
||||
"metadata": { |
||||
"kernelspec": { |
||||
"display_name": "Python 3", |
||||
"language": "python", |
||||
"name": "python3" |
||||
}, |
||||
"language_info": { |
||||
"codemirror_mode": { |
||||
"name": "ipython", |
||||
"version": 3 |
||||
}, |
||||
"file_extension": ".py", |
||||
"mimetype": "text/x-python", |
||||
"name": "python", |
||||
"nbconvert_exporter": "python", |
||||
"pygments_lexer": "ipython3", |
||||
"version": "3.6.5" |
||||
} |
||||
}, |
||||
"nbformat": 4, |
||||
"nbformat_minor": 2 |
||||
} |
@@ -0,0 +1,259 @@
|
||||
{ |
||||
"cells": [ |
||||
{ |
||||
"cell_type": "code", |
||||
"execution_count": null, |
||||
"metadata": {}, |
||||
"outputs": [], |
||||
"source": [ |
||||
"import tempfile\n", |
||||
"import glob\n", |
||||
"import shutil\n", |
||||
"import os\n", |
||||
"import numpy as np\n", |
||||
"from netCDF4 import Dataset\n", |
||||
"from wrf import getvar, ll_to_xy, CoordPair, GeoBounds, to_np\n", |
||||
"\n", |
||||
"_VARS_TO_KEEP = (\"Times\", \"XLAT\", \"XLONG\", \"XLAT_U\", \"XLAT_V\", \"XLONG_U\", \n", |
||||
" \"XLONG_V\", \"U\", \"V\", \"W\", \"PH\", \"PHB\", \"T\", \"P\", \"PB\", \"Q2\", \n", |
||||
" \"T2\", \"PSFC\", \"U10\", \"V10\", \"XTIME\", \"QVAPOR\", \"QCLOUD\", \n", |
||||
" \"QGRAUP\", \"QRAIN\", \"QSNOW\", \"QICE\", \"MAPFAC_M\", \"MAPFAC_U\",\n", |
||||
" \"MAPFAC_V\", \"F\", \"HGT\", \"RAINC\", \"RAINSH\", \"RAINNC\", \"I_RAINC\", \"I_RAINNC\",\n", |
||||
" \"PBLH\")\n", |
||||
"\n", |
||||
"class FileReduce(object):\n", |
||||
" def __init__(self, filenames, geobounds, tempdir=None, vars_to_keep=None, \n", |
||||
" max_pres=None, compress=False, delete=True, reuse=False):\n", |
||||
" \"\"\"An iterable object for cutting out geographic domains.\n", |
||||
" \n", |
||||
" Args:\n", |
||||
" \n", |
||||
" filenames (sequence): A sequence of file paths to the WRF files\n", |
||||
" \n", |
||||
" geobounds (GeoBounds): A GeoBounds object defining the region of interest\n", |
||||
" \n", |
||||
" tempdir (str): The location to store the temporary cropped data files. If None, tempfile.mkdtemp is used.\n", |
||||
" \n", |
||||
" vars_to_keep (sequence): A sequence of variables names to keep from the original file. None for all vars.\n", |
||||
" \n", |
||||
" max_press (float): The maximum pressure height level to keep. None for all levels.\n", |
||||
" \n", |
||||
" compress(bool): Set to True to enable zlib compression of variables in the output.\n", |
||||
" \n", |
||||
" delete (bool): Set to True to delete the temporary directory when FileReduce is garbage collected.\n", |
||||
" \n", |
||||
" reuse (bool): Set to True when you want to resuse the files that were previously converted. *tempdir* \n", |
||||
" must be set to a specific directory that contains the converted files and *delete* must be False.\n", |
||||
" \n", |
||||
" \n", |
||||
" \"\"\"\n", |
||||
" self._filenames = filenames\n", |
||||
" self._i = 0\n", |
||||
" self._geobounds = geobounds\n", |
||||
" self._delete = delete\n", |
||||
" self._vars_to_keep = vars_to_keep\n", |
||||
" self._max_pres = max_pres\n", |
||||
" self._compress = compress\n", |
||||
" self._cache = set()\n", |
||||
" self._own_data = True\n", |
||||
" self._reuse = reuse\n", |
||||
" \n", |
||||
" if tempdir is not None:\n", |
||||
" if not os.path.exists(tempdir):\n", |
||||
" os.makedirs(tempdir)\n", |
||||
" self._tempdir = tempdir\n", |
||||
" if self._reuse:\n", |
||||
" self._cache = set((os.path.join(self._tempdir, name) \n", |
||||
" for name in os.listdir(self._tempdir)))\n", |
||||
" else:\n", |
||||
" self._tempdir = tempfile.mkdtemp()\n", |
||||
"\n", |
||||
" print (\"temporary directory is: {}\".format(self._tempdir))\n", |
||||
" self._prev = None\n", |
||||
" self._set_extents()\n", |
||||
" \n", |
||||
" def _set_extents(self):\n", |
||||
" fname = list(self._filenames)[0]\n", |
||||
" with Dataset(fname) as ncfile:\n", |
||||
" lons = [self._geobounds.bottom_left.lon, self._geobounds.top_right.lon]\n", |
||||
" lats = [self._geobounds.bottom_left.lat, self._geobounds.top_right.lat]\n", |
||||
" orig_west_east = len(ncfile.dimensions[\"west_east\"])\n", |
||||
" orig_south_north = len(ncfile.dimensions[\"south_north\"])\n", |
||||
" orig_bottom_top = len(ncfile.dimensions[\"bottom_top\"])\n", |
||||
" \n", |
||||
" # Note: Not handling the moving nest here\n", |
||||
" # Extra points included around the boundaries to ensure domain is fully included\n", |
||||
" x_y = ll_to_xy(ncfile, lats, lons, meta=False)\n", |
||||
" self._start_x = 0 if x_y[0,0] == 0 else x_y[0,0] - 1\n", |
||||
" self._end_x = orig_west_east - 1 if x_y[0,1] >= orig_west_east - 1 else x_y[0,1] + 1\n", |
||||
" self._start_y = 0 if x_y[1,0] == 0 else x_y[1,0] - 1\n", |
||||
" self._end_y = orig_south_north - 1 if x_y[1,1] >= orig_south_north - 1 else x_y[1,1] + 1\n", |
||||
" \n", |
||||
" self._west_east = self._end_x - self._start_x + 1\n", |
||||
" self._west_east_stag = self._west_east + 1\n", |
||||
" self._south_north = self._end_y - self._start_y + 1\n", |
||||
" self._south_north_stag = self._south_north + 1\n", |
||||
" \n", |
||||
" # Crop the vertical to the specified pressure\n", |
||||
" if self._max_pres is not None:\n", |
||||
" pres = getvar(ncfile, \"pressure\")\n", |
||||
" # Find the lowest terrain height\n", |
||||
" ter = to_np(getvar(ncfile, \"ter\"))\n", |
||||
" min_ter = float(np.amin(ter)) + 1\n", |
||||
" ter_less = ter <= min_ter\n", |
||||
" ter_less = np.broadcast_to(ter_less, pres.shape)\n", |
||||
" # For the lowest terrain height, find the lowest vertical index to meet \n", |
||||
" # the desired pressure level. The lowest terrain height will result in the \n", |
||||
" # largest vertical spread to find the pressure level.\n", |
||||
" x = np.transpose(((pres.values <= self._max_pres) & ter_less).nonzero())\n", |
||||
" self._end_bot_top = np.amin(x, axis=0)[0] \n", |
||||
" if (self._end_bot_top >= orig_bottom_top):\n", |
||||
" self._end_bot_top = orig_bottom_top - 1\n", |
||||
" else:\n", |
||||
" self._end_bot_top = orig_bottom_top - 1\n", |
||||
" \n", |
||||
" self._bottom_top = self._end_bot_top + 1\n", |
||||
" self._bottom_top_stag = self._bottom_top + 1\n", |
||||
" \n", |
||||
" print(\"bottom_top\", self._bottom_top)\n", |
||||
" \n", |
||||
" \n", |
||||
" def __iter__(self):\n", |
||||
" return self\n", |
||||
" \n", |
||||
" def __copy__(self):\n", |
||||
" cp = type(self).__new__(self.__class__)\n", |
||||
" cp.__dict__.update(self.__dict__)\n", |
||||
" cp._own_data = False\n", |
||||
" cp._delete = False\n", |
||||
" \n", |
||||
" return cp\n", |
||||
" \n", |
||||
" def __del__(self):\n", |
||||
" if self._delete:\n", |
||||
" shutil.rmtree(self._tempdir)\n", |
||||
" \n", |
||||
" def reduce(self, fname):\n", |
||||
" outfilename = os.path.join(self._tempdir, os.path.basename(fname))\n", |
||||
" \n", |
||||
" # WRF-Python can iterate over sequences several times during a 'getvar', so a cache is used to \n", |
||||
" if outfilename in self._cache:\n", |
||||
" return Dataset(outfilename)\n", |
||||
" \n", |
||||
" # New dimension sizes\n", |
||||
" dim_d = {\"west_east\" : self._west_east,\n", |
||||
" \"west_east_stag\" : self._west_east_stag,\n", |
||||
" \"south_north\" : self._south_north,\n", |
||||
" \"south_north_stag\" : self._south_north_stag,\n", |
||||
" \"bottom_top\" : self._bottom_top,\n", |
||||
" \"bottom_top_stag\" : self._bottom_top_stag\n", |
||||
" }\n", |
||||
" \n", |
||||
" # Data slice sizes for the 2D dimensions\n", |
||||
" slice_d = {\"west_east\" : slice(self._start_x, self._end_x + 1),\n", |
||||
" \"west_east_stag\" : slice(self._start_x, self._end_x + 2),\n", |
||||
" \"south_north\" : slice(self._start_y, self._end_y + 1),\n", |
||||
" \"south_north_stag\" : slice(self._start_y, self._end_y + 2),\n", |
||||
" \"bottom_top\" : slice(None, self._end_bot_top + 1),\n", |
||||
" \"bottom_top_stag\" : slice(None, self._end_bot_top + 2)\n", |
||||
" }\n", |
||||
" \n", |
||||
" with Dataset(fname) as infile, Dataset(outfilename, mode=\"w\") as outfile:\n", |
||||
" \n", |
||||
" # Copy the global attributes\n", |
||||
" outfile.setncatts(infile.__dict__)\n", |
||||
"\n", |
||||
" # Copy Dimensions, limiting south_north and west_east to desired domain\n", |
||||
" for name, dimension in infile.dimensions.items():\n", |
||||
" dimsize = dim_d.get(name, len(dimension))\n", |
||||
" outfile.createDimension(name, dimsize)\n", |
||||
"\n", |
||||
" # Copy Variables \n", |
||||
" for name, variable in infile.variables.items():\n", |
||||
" if self._vars_to_keep is not None:\n", |
||||
" if name not in self._vars_to_keep:\n", |
||||
" continue\n", |
||||
" \n", |
||||
" print (name)\n", |
||||
" new_slices = tuple((slice_d.get(dimname, slice(None)) for dimname in variable.dimensions))\n", |
||||
"\n", |
||||
" outvar = outfile.createVariable(name, variable.datatype, variable.dimensions, zlib=self._compress)\n", |
||||
"\n", |
||||
" outvar[:] = variable[new_slices]\n", |
||||
"\n", |
||||
" outvar.setncatts(variable.__dict__)\n", |
||||
" \n", |
||||
" \n", |
||||
" result = Dataset(outfilename)\n", |
||||
" \n", |
||||
" self._cache.add(outfilename)\n", |
||||
" \n", |
||||
" return result\n", |
||||
" \n", |
||||
" \n", |
||||
" def next(self):\n", |
||||
" if self._i >= len(self._filenames):\n", |
||||
" if self._prev is not None:\n", |
||||
" self._prev.close()\n", |
||||
" raise StopIteration\n", |
||||
" else:\n", |
||||
" fname = self._filenames[self._i]\n", |
||||
" reduced_file = self.reduce(fname)\n", |
||||
" if self._prev is not None:\n", |
||||
" self._prev.close()\n", |
||||
" self._prev = reduced_file\n", |
||||
" \n", |
||||
" self._i += 1\n", |
||||
" \n", |
||||
" return reduced_file\n", |
||||
" \n", |
||||
" # Python 3\n", |
||||
" def __next__(self):\n", |
||||
" return self.next()\n", |
||||
"\n", |
||||
"# How to use with getvar\n", |
||||
"# Set lower left and upper right to your desired domain\n", |
||||
"# Idaho bounding box: [\"41.9880561828613\",\"49.000846862793\",\"-117.243034362793\",\"-111.043563842773\"]\n", |
||||
"ll = CoordPair(lat=41.8, lon=-117.26)\n", |
||||
"ur = CoordPair(lat=49.1, lon=-110.5)\n", |
||||
"bounds = GeoBounds(ll, ur)\n", |
||||
"reduced_files = FileReduce(glob.glob(\"/Users/ladwig/Documents/wrf_files/boise_tutorial/orig/wrfout_*\"),\n", |
||||
" bounds, vars_to_keep=_VARS_TO_KEEP, max_pres=400,\n", |
||||
" tempdir=\"/Users/ladwig/Documents/wrf_files/boise_tutorial/reduced\", \n", |
||||
" delete=False, reuse=True)\n", |
||||
"\n", |
||||
"pres = getvar(reduced_files, \"pressure\")\n", |
||||
"\n", |
||||
"print(pres)" |
||||
] |
||||
}, |
||||
{ |
||||
"cell_type": "code", |
||||
"execution_count": null, |
||||
"metadata": {}, |
||||
"outputs": [], |
||||
"source": [] |
||||
} |
||||
], |
||||
"metadata": { |
||||
"kernelspec": { |
||||
"display_name": "Python 3", |
||||
"language": "python", |
||||
"name": "python3" |
||||
}, |
||||
"language_info": { |
||||
"codemirror_mode": { |
||||
"name": "ipython", |
||||
"version": 3 |
||||
}, |
||||
"file_extension": ".py", |
||||
"mimetype": "text/x-python", |
||||
"name": "python", |
||||
"nbconvert_exporter": "python", |
||||
"pygments_lexer": "ipython3", |
||||
"version": "3.7.1" |
||||
} |
||||
}, |
||||
"nbformat": 4, |
||||
"nbformat_minor": 2 |
||||
} |
@@ -0,0 +1,22 @@
|
||||
# Create full conda environment for testing, including some useful tools |
||||
name: testenv2 |
||||
channels: |
||||
- conda-forge |
||||
dependencies: |
||||
- python=2 |
||||
- wrapt |
||||
- numpy |
||||
- matplotlib |
||||
- netcdf4 |
||||
- xarray |
||||
- jupyter |
||||
- sphinx |
||||
- sphinx_rtd_theme |
||||
- pycodestyle |
||||
- cartopy |
||||
- basemap |
||||
- clang_osx-64 |
||||
- gfortran_osx-64 |
||||
- pynio |
||||
- ncl |
||||
|
@@ -0,0 +1,22 @@
|
||||
# Create full conda environment for testing, including some useful tools |
||||
name: testenv3 |
||||
channels: |
||||
- conda-forge |
||||
dependencies: |
||||
- python=3 |
||||
- wrapt |
||||
- numpy |
||||
- matplotlib |
||||
- netcdf4 |
||||
- xarray |
||||
- jupyter |
||||
- sphinx |
||||
- sphinx_rtd_theme |
||||
- pycodestyle |
||||
- cartopy |
||||
- basemap |
||||
- clang_osx-64 |
||||
- gfortran_osx-64 |
||||
- pynio |
||||
- ncl |
||||
|
@@ -0,0 +1,19 @@
|
||||
# Create full conda environment for development, including some useful tools |
||||
name: develop |
||||
channels: |
||||
- conda-forge |
||||
dependencies: |
||||
- python=3 |
||||
- wrapt |
||||
- numpy |
||||
- matplotlib |
||||
- netcdf4 |
||||
- xarray |
||||
- jupyter |
||||
- sphinx |
||||
- sphinx_rtd_theme |
||||
- pycodestyle |
||||
- cartopy |
||||
- basemap |
||||
- m2w64-toolchain |
||||
|