Paul Julian II, PhD

Ecologist, Wetland Biogeochemist, Data Scientist, lover of Rstats.

NetCDF data and R

Written on May 15, 2019

Keywords: data format, NetCDF, R


Apologies for the extreme hiatus in blog posts. I am going to ease back into things by discussing NetCDF files and how to work with them in R. In recent years, NetCDF files have gained popularity, especially for climatology, meteorology, remote sensing, oceanographic, and GIS data applications. As you might have guessed, NetCDF is an acronym that stands for Network Common Data Form. NetCDF can be described as an open-source, array-oriented data storage format.

Here are a few examples of data stored in NetCDF format:

- Daily interpolated Everglades water level (stage) data from the USGS Everglades Depth Estimation Network (EDEN)

For this blog post I will walk through my exploration of working with NetCDF files in R using the Everglades water level data (above) as an example. The data is actually a double whammy: it's geospatial and hydrologic data. This blog post is ultimately bits and pieces of other blog posts scattered around the web. I hope you find it useful.


First, let's load the necessary R packages/libraries.
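A minimal set of packages for what follows (the original post's exact list may have differed, but these cover everything used below):

```r
library(ncdf4)   # read and inspect NetCDF files
library(chron)   # date handling
library(raster)  # raster/gridded data
library(tmap)    # thematic mapping
```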

Navigate to the data website and pick a data file. I decided on 2017_Q4 (link will download a file). Once the file is downloaded, unzip it using either R's unzip() function or other decompression software/operating system tools.
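For the R route, something like the following works (the zip file name here is an assumption; adjust it to whatever the download is saved as):

```r
# unzip the downloaded archive into the project folder (file name assumed)
unzip("2017_q4.zip", exdir = "D:/UF/_Working_Blog/NetCDFBlog")
```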

Alright, now that the file is downloaded and unzipped, let's open it. I created a variable called working.dir, a string indicating my “working directory” location (working.dir<-"D:/UF/_Working_Blog/NetCDFBlog") that points to where I downloaded and unzipped the data file.
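A sketch of opening the connection with ncdf4's nc_open() (the path matches the print() output below):

```r
working.dir <- "D:/UF/_Working_Blog/NetCDFBlog"

# open a connection to the NetCDF file
dat.nc <- nc_open(paste0(working.dir, "/USGS_EDEN/2017_q4.nc"))
```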

Excellent! The data is now in the R environment; let's take a look at it.
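Printing the file connection dumps the embedded metadata:

```r
print(dat.nc)
```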

```
## File D:/UF/_Working_Blog/NetCDFBlog/USGS_EDEN/2017_q4.nc (NC_FORMAT_CLASSIC):
## 
##      2 variables (excluding dimension variables):
##         float stage[x,y,time]   
##             long_name: stage
##             esri_pe_string: PROJCS["NAD_1983_UTM_Zone_17N",GEOGCS["GCS_North_American_1983",DATUM["D_North_American_1983",SPHEROID["GRS_1980",6378137.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",500000.0],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-81.0],PARAMETER["Scale_Factor",0.9996],PARAMETER["Latitude_Of_Origin",0.0],UNIT["Meter",1.0]]
##             coordinates: x y
##             grid_mapping: transverse_mercator
##             units: cm
##             min: -8.33670043945312
##             max: 499.500701904297
##         int transverse_mercator[]   
##             grid_mapping_name: transverse_mercator
##             longitude_of_central_meridian: -81
##             latitude_of_projection_origin: 0
##             scale_factor_at_central_meridian: 0.9996
##             false_easting: 5e+05
##             false_northing: 0
##             semi_major_axis: 6378137
##             inverse_flattening: 298.257222101
## 
##      3 dimensions:
##         time  Size:92
##             long_name: model timestep
##             units: days since 2017-10-01T12:00:00Z
##         y  Size:405
##             long_name: y coordinate of projection
##             standard_name: projection_y_coordinate
##             units: m
##         x  Size:287
##             long_name: x coordinate of projection
##             standard_name: projection_x_coordinate
##             units: m
## 
##     2 global attributes:
##         Conventions: CF-1.0
##         Source_Software: JEM NetCDF writer
```

As you can see, all the metadata needed to understand what the dat.nc file is lives within the file itself. Definitely a pro for using NetCDF files. Now let's extract our three dimensions (x, y, and time; note that the x/y coordinates here are projected UTM values in meters rather than raw latitude/longitude).
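A sketch using ncvar_get() and ncatt_get(), with the variable names taken from the metadata above:

```r
# dimension variables: projected x/y coordinates (m) and model time steps (days)
x <- ncvar_get(dat.nc, "x")
y <- ncvar_get(dat.nc, "y")
time <- ncvar_get(dat.nc, "time")
tunits <- ncatt_get(dat.nc, "time", "units")   # "days since 2017-10-01T12:00:00Z"
```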

Now that we have our coordinates in space and time extracted, let's get to the actual data. If you remember, when we viewed the dataset using print() and sifted through the output, you'll have noticed long_name: stage; that is our data.
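Extracting the stage array and its attributes might look like this (the _FillValue lookup is my addition; it's worth grabbing before the file is closed in case the attribute is defined):

```r
# pull the stage variable (dimensions x, y, time) and its attributes
tmp_array <- ncvar_get(dat.nc, "stage")
dlname    <- ncatt_get(dat.nc, "stage", "long_name")
dunits    <- ncatt_get(dat.nc, "stage", "units")
fillvalue <- ncatt_get(dat.nc, "stage", "_FillValue")  # may or may not be set
dim(tmp_array)   # 287 405 92
```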

We can also extract other information, including global attributes such as title, source, references, etc., from the dataset's metadata.
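Global attributes are retrieved with varid = 0; this particular file only carries the two shown in the header:

```r
ncatt_get(dat.nc, 0, "Conventions")$value       # "CF-1.0"
ncatt_get(dat.nc, 0, "Source_Software")$value   # "JEM NetCDF writer"
```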

Now that we have everything we wanted from the NetCDF file, we can “close” the connection using nc_close(dat.nc).

The data we just extracted from this NetCDF file is a time series of daily, spatially interpolated water level data across a large ecosystem (several thousand km²). Using the chron library, let's format this data into something more workable by extracting the date information from the tunits$value and time variables.
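A sketch of pulling the origin date apart (the substr() call trims the "T12:00:00Z" tail off the timestamp before splitting):

```r
# split "days since 2017-10-01T12:00:00Z" into its pieces
tustr <- strsplit(tunits$value, " ")[[1]]
tdstr <- strsplit(substr(tustr[3], 1, 10), "-")[[1]]
tyear  <- as.integer(tdstr[1])
tmonth <- as.integer(tdstr[2])
tday   <- as.integer(tdstr[3])
```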

We know from both the metadata within the file and from where we retrieved the data that this NetCDF is an array representing the fourth quarter of 2017. Combining this with the day, month, and year information extracted above, we can determine the date corresponding to each slice of the NetCDF array.
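A sketch of that arithmetic using base-R date math (chron(time, origin = c(tmonth, tday, tyear)) is the chron-flavored equivalent):

```r
# origin is 2017-10-01; each time value is a whole-day offset from it
date.val <- as.Date(paste(tyear, tmonth, tday, sep = "-")) + floor(time)
head(format(date.val))
```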

```
## [1] "2017-10-01" "2017-10-02" "2017-10-03" "2017-10-04" "2017-10-05"
## [6] "2017-10-06"
```

Like most data, sometimes values of a variable are missing or not available (outside of the modeling domain) and are identified using a specific “fill value” (_FillValue) or “missing value” (missing_value). Replacing these values is pretty simple and similar to doing so in a normal data frame.
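Assuming the fill value attribute was captured before closing the file (as in the sketch above), the swap is one line:

```r
# replace fill values with NA (guarded in case the attribute is absent)
if (fillvalue$hasatt) tmp_array[tmp_array == fillvalue$value] <- NA
```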

To count the number of data points (i.e. non-NA values) you can use length(na.omit(as.vector(tmp_array[,,1]))).

Now that the data is extracted, sorted, and cleaned, let's take a couple of slices from this beautiful NetCDF pie.
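The third index of the array is the time step, so a daily slice is just a matrix subset; here I take days 1 and 20 (their dates are printed below):

```r
slice1 <- tmp_array[, , 1]    # first day of the quarter
slice2 <- tmp_array[, , 20]   # twentieth day

format(date.val[1])
format(date.val[20])
```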

```
## [1] "2017-10-01"
## [1] "2017-10-20"
```

Before we get to the mapping nitty-gritty, visit my prior blog post on Mapping in #rstats. To generate the maps I will be using the tmap library.

If you notice, slice1 and slice2 are “large matrix” objects, not spatial files yet. Remember back when we looked at the metadata in the file header using print(dat.nc)? It identified the spatial projection of the data in the esri_pe_string: field. During the extraction of a NetCDF slice and its transformation to a raster, the data ends up in the wrong orientation, so we have to use the flip() function to point it in the correct direction.
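A sketch of the matrix-to-raster conversion (the helper name slice2raster is mine, and the CRS string is a proj4 rendering of the NAD83 / UTM Zone 17N projection given in the metadata):

```r
# CRS from the file's esri_pe_string metadata: NAD83 / UTM Zone 17N
utm17 <- CRS("+proj=utm +zone=17 +datum=NAD83 +units=m +no_defs")

# hypothetical helper: matrix slice -> raster, flipped to the correct
# north-south orientation
slice2raster <- function(slice) {
  r <- raster(t(slice),
              xmn = min(x), xmx = max(x),
              ymn = min(y), ymx = max(y),
              crs = utm17)
  flip(r, direction = "y")
}

slice1.r <- slice2raster(slice1)
slice2.r <- slice2raster(slice2)
```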

Now let's take a look at our work!
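A minimal tmap call (the palette and layout choices are mine, not necessarily the original post's):

```r
tm_shape(slice1.r) +
  tm_raster(palette = "Blues", title = "Stage (cm), 2017-10-01") +
  tm_layout(frame = FALSE)
```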

Now that these slices are raster files, you can take it one step further and do raster-based math… for instance, seeing the change in water level between the two dates.
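Raster arithmetic works element-wise, so the difference map is a single subtraction; a sketch:

```r
# change in water level (cm) between 2017-10-20 and 2017-10-01
stg.delta <- slice2.r - slice1.r

tm_shape(stg.delta) +
  tm_raster(palette = "RdBu", title = "Change in stage (cm)") +
  tm_layout(frame = FALSE)
```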