% /[escript]/branches/doubleplusgood/doc/inversion/CookGravity.tex
% revision 4344 by jfenwick, Wed Feb 27 03:42:40 2013 UTC
% revision 4345 by jfenwick, Fri Mar 29 07:09:41 2013 UTC
The driver functions enable geologists and geophysicists to apply the
\downunder module quickly and in an easy way without requiring detailed
knowledge of the theory behind inversion or programming skills.
However, users who are interested in specializing or extending the inversion
capacity are referred to Part~\ref{part2} of this manual.
It is beyond the intention of this manual to give a detailed introduction to
geophysical inversion, in particular to the appropriate preprocessing of data
sets.

The \downunder module described here is designed to calculate estimates for
the 3-D distribution of density and/or susceptibility from 2-D gravity and

To run the examples discussed you need to have \escript (version 3.3.1 or
newer) installed on your computer.
Moreover, if you want to visualize the results you need access to
data plotting software able to process \VTK input files, e.g.
\mayavi or \VisIt.
As \mayavi can be easily obtained and installed for most platforms the
tutorial includes commands to visualize output files using \mayavi.
However, it is pointed out that \VisIt is the preferred visualization tool for
\escript as it can deal with very large data sets more efficiently.

\begin{figure}
\centering
\includegraphics[width=0.7\textwidth]{QLDWestGravityDataPlot.png}
\caption{Gravity Anomaly Data in $mgal$ from Western Queensland, Australia
    (file \examplefile{data/QLDWestGravity.nc}). Data obtained from Geoscience Australia.}
\label{FIG:P1:GRAV:0}
\end{figure}

\begin{figure}
\centering%
\includegraphics[width=0.9\textwidth]{QLDGravContourMu10.png}
\caption{3-D contour plot of the density distribution obtained by inversion of
    file \file{data/QLDWestGravity.nc} (with $\mu=10$).
    Colours represent values of density where high values are represented by
    blue and low values are represented by red.}
\label{FIG:P1:GRAV:1}
\end{figure}

\begin{verbatim}
dom.setFractionalPadding(pad_x=0.2, pad_y=0.2)
dom.fixDensityBelow(depth=40.*U.km)

# Step 2: read gravity data
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc')
dom.addSource(source0)

# Step 3: set up inversion
\end{verbatim}
It is recommended to use the extension \file{.py} for the
\Python script.
We will discuss the statements of the script later in this chapter.

Obviously the inversion needs to be fed with some gravity data. You can find
example data from western Queensland, Australia in two resolutions in the
\escript example directory. In this case the data are loaded from the file
\examplefile{GravitySmall.nc} which is given in the \netcdf file format.
After you have copied this file into the directory in which you have saved the
script \file{grav.py} you can run the program using the command line
\begin{verbatim}
run-escript grav.py
\end{verbatim}
The file has the \VTK format and can be imported easily into many
visualization tools.
One option is the \mayavi package which is available on most platforms.
You can invoke the visualization using the commands
\begin{verbatim}
mayavi2 -d result.vtu -m SurfaceMap
\end{verbatim}
from the command line.
Figure~\ref{FIG:P1:GRAV:1} shows the result of this inversion as a contour
plot\footnote{These plots were generated by \VisIt using the higher resolution
data.}, while the gravity anomaly data is shown in Figure~\ref{FIG:P1:GRAV:0}.
We will discuss the data later in Section~\ref{SEC:P1:GRAV:DATA}.

Let us take a closer look at the script\footnote{In \Python lines starting
with `\#` are comments and are not processed further.}. Besides the header
section one can separate the script into five steps:
\begin{enumerate}
    \item set up domain on which the inversion problem is solved
    \item read gravity data
    \item set up inversion
    \item run inversion
    \item write the result
\end{enumerate}
\end{figure}

\section{Creating the Inversion Domain}
The first step in Script~\ref{code: gravity1} is the definition of the domain
over which the inversion is performed.
We think of the domain as a block with orthogonal, plane faces.
Figure~\ref{FIG:P1:GRAV:2} shows the set-up for a two-dimensional domain
(see also Figure~\ref{fig:domainBuilder} for 3-D).
The lateral coordinates along the surface of the Earth are denoted by $x$ and
$y$ (only the $x$-direction is used in 2-D).
The $z$ direction defines the vertical direction where the part above the
surface has positive coordinates and the subsurface negative coordinates.
The height of the section above the surface, which is assumed to be filled
267  the depth of the domain no constraint at depth is applied to the density.  the depth of the domain no constraint at depth is applied to the density.
268    
269  \downunder uses the metre-kilogram-second based International System of Units  \downunder uses the metre-kilogram-second based International System of Units
270  (SI)~\footnote{see \url{http://en.wikipedia.org/wiki/International_System_of_Units}}\index{SI}.  (SI)\footnote{see \url{http://en.wikipedia.org/wiki/International_System_of_Units}}\index{SI}.
271  So all values must be converted to appropriate units.  So all values must be converted to appropriate units.
272  This does not apply to geographic coordinates which in \downunder are given in  This does not apply to geographic coordinates which in \downunder are given in
273  fractional degrees (as a floating point number) to represent longitude and  fractional degrees (as a floating point number) to represent longitude and
latitude. In the script we have used the statement
\begin{verbatim}
depth=40.*U.km
\end{verbatim}
to set the depth of the domain to $40\;km$.
The expression \verb|U.km| denotes the unit $km$ (kilometer) and ensures
appropriate conversion of the value $40$ into the base unit $m$ (meter).
It is recommended to add units to values (where present) in order to make sure
that the final values handed into \downunder are given in the appropriate
units.
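The unit constants act as plain multiplicative conversion factors into SI base
units. The following sketch illustrates the idea in plain \Python (the factor
values are standard SI definitions; the names mimic, but are not, the actual
\verb|U| module):

```python
# Unit constants as conversion factors into SI base units
# (illustration only; not the actual esys.escript unit module).
km = 1000.0    # kilometre expressed in metres
m = 1.0        # metre (base unit)

depth = 40. * km   # "40 km" expressed in metres
print(depth)
```

Multiplying a number by a unit constant therefore always yields a value in
base units, which is what \downunder expects internally.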

The data set covers a rectangular region
and between $20^o$ and $21^o$ south.
Notice that latitude varies between $-90^o$ and $90^o$ where negative signs
refer to places in the southern hemisphere and longitude varies between
$-180^o$ and $180^o$ where negative signs refer to places west of Greenwich.
The colour at a location represents the value of the vertical Bouguer gravity
anomaly at this point at the surface of the Earth.
Values in this data set range from $-160 \; mgal$ to about $500 \; mgal$\footnote{The unit
$mgal$ means milli $gal$ (galileo) with $1 \; gal = 0.01 \frac{m}{sec^2}$.}
over a $121 \times 121$ grid.

In general, a patch of gravity data needs to be defined over a plane
\verb|NX| $\times$ \verb|NY| where \verb|NX| and \verb|NY| define the number
of grid lines in the longitude (\verb|X|) and the latitude (\verb|Y|)
direction, respectively.
The grid is spanned from an origin with spacings \verb|DELTA_X| and
\verb|DELTA_Y| in the longitude and the latitude direction, respectively.
The gravity data for all grid points need to be given as an \verb|NX|
$\times$ \verb|NY| array.
If available, measurement errors can be associated with the gravity data.
The values are given as an \verb|NX| $\times$ \verb|NY| array matching the
shape of the gravity array.
Note that data need not be available on every single point of the grid, see
Section~\ref{SEC:P1:GRAV:REMARK:DATAHOLES} for more information on this.
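The grid convention just described can be sketched in plain \Python. The names
\verb|NX|, \verb|NY|, \verb|DELTA_X| and \verb|DELTA_Y| follow the text; the
origin and all values are made up for illustration:

```python
# Sketch of the gridded-data layout described above (values are made up).
NX, NY = 4, 3                      # number of grid lines in longitude/latitude
ORIGIN_X, ORIGIN_Y = 140.0, -21.0  # assumed south-west corner in degrees
DELTA_X, DELTA_Y = 0.25, 0.5       # grid spacing in degrees

def grid_point(i, j):
    """Longitude/latitude of grid point (i, j)."""
    return (ORIGIN_X + i * DELTA_X, ORIGIN_Y + j * DELTA_Y)

# gravity values (and, optionally, errors) as NX x NY arrays
gravity = [[0.0] * NY for _ in range(NX)]
errors = [[2.0] * NY for _ in range(NX)]

print(grid_point(1, 2))
```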

Currently, two data file formats are supported, namely \emph{ER Mapper Raster}
\cite{ERMAPPER} files and \netcdf~\cite{NETCDF} files.
In the examples of this chapter we use \netcdf files and refer to
Section~\ref{SEC:P1:GRAV:REMARK:ERMAPPER} and
Section~\ref{sec:ref:DataSource:ERM} for more information on using ER Mapper
Raster files.
If you have data in any other format you have the option of writing a suitable
reader (for advanced users, see Chapter~\ref{Chp:ref:data sources}) or,
assuming you are able to read the data in \Python, refer to the example
script \examplefile{create_netcdf.py} which shows how to create a file
in the \netcdf file format~\cite{NETCDF} compatible with \downunder from
a data array.
In Script~\ref{code: gravity1} we use the statement
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc')
\end{verbatim}
to load the gravity data stored in \examplefile{GravitySmall.nc} in the
\netcdf format.
Within the script the data set is now available under the name \verb|source0|.
We need to link the data set to the \verb|DomainBuilder| using
\begin{verbatim}
dom.addSource(source0)
\end{verbatim}
This is achieved by projecting the data to the
\emph{Universal Transverse Mercator} (UTM) coordinate system\footnote{See e.g.
\url{http://en.wikipedia.org/wiki/Universal_Transverse_Mercator_coordinate_system}}.

There are a few optional arguments you can add when constructing a data source.
While Section~\ref{sec:ref:DataSource} has a detailed description of all
arguments it is worth noting a few.
Firstly, it is important to know that data slices are assumed to be at altitude
$0$ by default. This can be easily changed though:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc', altitude=2.5*U.km)
\end{verbatim}
Another important setting is the scale or unit of the measurements.
The default is dependent on the data type and for gravity anomalies a scale
of $\frac{\mu m}{sec^2}$ (or $0.1 \; mgal$) is assumed. For instance to
change the default scale to $mgal$ (which is $10^{-5} \frac{m}{sec^2}$),
you could use:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc', scale_factor=U.mgal)
\end{verbatim}
Finally, it is possible to specify measurement errors (i.e. uncertainties)
alongside the data.
Since these can never be zero, a value of $2$ units is used if nothing else
has been specified.
The error value is assumed to be given in the same units as the data so the
default value translates to an error of $0.2 \; mgal$.
There are two possibilities to specify the error, namely by providing a
constant value which is applied to all data points:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc', error=1.7)
\end{verbatim}
or, if the information is available in the same \netcdf file under the name
\verb|errors|, provide \downunder with the appropriate variable name:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'GravitySmall.nc', error="errors")
\end{verbatim}
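The unit bookkeeping behind these defaults is simple arithmetic; here is a
hedged sketch in plain \Python (the constants reflect the SI definitions
quoted in the text, not the actual API):

```python
# Default gravity data unit: 1 micrometre/sec^2, i.e. 0.1 mgal (from the text).
default_scale = 1e-6   # m/sec^2 per data unit
mgal = 1e-5            # m/sec^2

default_error_units = 2.0   # default error, given in data units
error_in_mgal = default_error_units * default_scale / mgal
print(error_in_mgal)
```

This reproduces the stated default error of $0.2\;mgal$; with a different
\verb|scale_factor| the same error value would translate accordingly.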
It is important to keep an eye on the complexity of the inversion.
A good measure is the total number of cells being used.
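Since the total number of cells is the product of the cell counts in each
direction, refining the lateral and vertical resolution in sync grows the cell
count cubically. A small illustration (the cell counts are made up):

```python
# Total cell count as the product of per-direction counts (made-up numbers).
def total_cells(nx, ny, nz):
    return nx * ny * nz

coarse = total_cells(121, 121, 40)
fine = total_cells(242, 242, 80)  # resolution doubled in every direction
print(fine // coarse)             # growth factor of the cell count
```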
This estimate of complexity growth applies if the increase
of data grid size is driven by an increase of resolution where it is
recommended to increase the vertical resolution in sync with the lateral
resolution. Note that if more than one data set is used the target resolution
will be the resolution of the finest data set (see also
Section~\ref{SEC:P1:GRAV:REMARK:MULTIDATA}).
In other cases the expansion of the region of interest drives an increase of
data grid size and the increase of the total number of cells is less dramatic
as the vertical number of cells can remain constant while keeping a balanced
process is aborted and an error message is printed.
In this case you can try to rerun the inversion with a larger value for the
maximum number of iteration steps.
If even for a very large number of iteration steps no convergence is achieved,
it is very likely that the inversion has not been set up properly.
For larger values of the trade-off factor the density distribution becomes
rougher, showing more detail.
Computational costs are significantly higher for larger trade-off factors.
Moreover, noise in the data has a higher impact on the result.
Typically several runs are required to adjust the value of the trade-off
factor to the data sets used.

For some analysis tools it is useful to process the results in the form of
Comma-separated Values (\CSV)\footnote{see
\url{http://en.wikipedia.org/wiki/Comma-separated_values}}.
Such a file can be created using the statement
\begin{verbatim}
saveDataCSV("result.csv", x=rho.getFunctionSpace().getX(), density=rho)
\end{verbatim}
This will create the file \file{result.csv}.
Each row contains the value of the density distribution and the three
coordinates of the corresponding location in the domain.
There is a header specifying the meaning of the corresponding column.
Notice that rows are not written in a particular order and therefore, if
necessary, the user has to apply appropriate sorting of the rows.
Columns are written in alphabetic order of their corresponding tag names.
The \verb|getX()| method returns the coordinates of the sampling points used
for the particular type of representation, see~\cite{ESCRIPT} for details.
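Since rows are written in no particular order, post-processing usually starts
by sorting them; here is a sketch in plain \Python using the \verb|csv|
module. The sample rows and the exact column layout are assumptions based on
the description above, not actual \file{result.csv} output:

```python
import csv
import io

# Made-up sample in the described layout: a header line, then one row per
# sample point, columns in alphabetic order of tag names (density, x_0..x_2).
sample = """density,x_0,x_1,x_2
2700.0,500.0,0.0,-1000.0
2300.0,0.0,0.0,-1000.0
"""

rows = list(csv.reader(io.StringIO(sample)))
header, data = rows[0], rows[1:]
# sort the rows by their coordinate columns
data.sort(key=lambda r: tuple(float(v) for v in r[1:]))
print(data[0][0])  # density at the point with the smallest coordinates
```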

\section{Remarks}

\subsection{ER Mapper Raster Files}\label{SEC:P1:GRAV:REMARK:ERMAPPER}
The \downunder module can read data stored in ER Mapper Raster files. A data
set in this format consists of two files, a header file whose name usually ends
in \verb|.ers| and the actual data file which has the same filename as the
header file but without any file extension.
These files are usually produced by a commercial software package and the
contents can be quite diverse.
Therefore, it is not guaranteed that every data set is supported by \downunder
but the most common types of raster data should work\footnote{If your data
does not load please contact us through \url{https://launchpad.net/escript-finley}.}.

The interface for loading ER Mapper files is very similar to the \netcdf
interface described in Section~\ref{SEC:P1:GRAV:DATA}.
To load gravity data stored in the file pair\footnote{These files are available
in the example directory.} \examplefile{GravitySmall.ers} (the
header) and \examplefile{GravitySmall} (the data) without changing any of the
defaults use:
\begin{verbatim}
source0=ErMapperData(ErMapperData.GRAVITY, 'GravitySmall.ers')
\end{verbatim}
If your data set does not follow the default naming convention you can specify
the name of the data file explicitly:
\begin{verbatim}
source0=ErMapperData(ErMapperData.GRAVITY, 'GravitySmall.ers',
                     datafile='GravityData')
\end{verbatim}
Please note that there is no way for the reader to determine if the two files
really form a pair so make sure to pass the correct filenames when constructing
the reader object.
The same optional arguments explained in Sections~\ref{SEC:P1:GRAV:DATA}
and~\ref{SEC:P1:GRAV:REMARK:DATAHOLES} are available for ER Mapper data sets.
However, due to limitations of the file format only a constant error value
is supported.

\subsection{Data With Holes}\label{SEC:P1:GRAV:REMARK:DATAHOLES}
As described previously in this chapter, input data is always given in the form
of a rectangular grid with constant cell size in each dimension.
However, valid data is not always available at every point of such a grid.
Consider an onshore data set which includes parts of the offshore region as in
Figure~\ref{FIG:P1:GRAV:onshoreoffshore}.
The valid data in this example has a value range of about $-600$ to $600$ and
the inversion is to be run based on these values only, disregarding the
offshore region.
In order to achieve that, the offshore region is \emph{masked} by using a
constant value which is not found within the onshore area.
Figure~\ref{FIG:P1:GRAV:onshoreoffshore} clearly shows this masked area in
dark blue since a mask value of $-1000$ was used.

\begin{figure}[ht]
    \centering
    \includegraphics[width=0.45\textwidth]{onshore-offshore.png}
    \caption{Plot of a rectangular gridded onshore data set that includes
    offshore regions which have a value (here $-1000$) not found within the
    real data (Bouguer anomalies in Tasmania, courtesy Geoscience Australia)}
    \label{FIG:P1:GRAV:onshoreoffshore}
\end{figure}

The \netcdf conventions supported in \downunder include a standard way of
specifying such a mask value.
The example script \examplefile{create_netcdf.py} demonstrates how this is
accomplished in an easy way with any data.
If, for any reason, the mask value in the input file is invalid it can be
overridden via the \verb|null_value| argument when constructing the
\verb|NetCdfData| object:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'data0.nc', null_value=-1000)
\end{verbatim}
In this example, all data points that have a value of $-1000$ are ignored and
not used in the inversion.
Please note that the special value \emph{NaN} (not-a-number) is sometimes used
for the purposes of masking in data sets.
Areas marked with this value are always disregarded in \downunder.
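The effect of a null value can be sketched in a few lines of plain \Python
(the values are made up; \downunder performs the equivalent filtering
internally):

```python
import math

NULL_VALUE = -1000.0   # mask value, as in the example above
values = [120.0, -1000.0, 35.5, float("nan"), -580.0]

def is_valid(v):
    # points equal to the mask value or marked NaN are disregarded
    return v != NULL_VALUE and not math.isnan(v)

valid = [v for v in values if is_valid(v)]
print(valid)
```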

\subsection{Multiple Data Sets}\label{SEC:P1:GRAV:REMARK:MULTIDATA}
It is possible to run a single inversion using more than one input data set,
possibly in different file formats.
To do so, simply create the data sources and add them to the domain builder:
\begin{verbatim}
source0=NetCdfData(NetCdfData.GRAVITY, 'data0.nc')
source1=ErMapperData(ErMapperData.GRAVITY, 'data1.ers')
dom.addSource(source0)
dom.addSource(source1)
\end{verbatim}
However, there are some restrictions when combining data sets:
\begin{itemize}
    \item Due to the coordinate transformation all data sets must be located in
        the same UTM zone. If a single dataset crosses UTM zones only the zone
        of the central longitude is used when projecting.
        For example, if one data set lies mostly in zone 51 but contains areas
        of zone 52, it is transformed using zone 51. In this case more data
        from zone 51 can be added, but not from any other zone.
    \item All data sets should have the same spatial resolution but this is not
        enforced. Combining data with different resolution is currently
        considered experimental but works best when the resolutions are
        multiples of each other. For example, if the first data set has a
        resolution (or cell size) of $100$ metres and the second has a cell
        size of $50$ metres then the target domain will have a cell size of
        $50$ metres (the finer resolution) and each point of the coarse data
        will occupy two cells (in the respective dimension).
\end{itemize}
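The resolution rule from the last item boils down to taking the finest cell
size; a small sketch with the numbers from the example above:

```python
# Combining the cell sizes of two data sets (in metres), as in the example.
cell_sizes = [100, 50]
target = min(cell_sizes)  # the finer resolution wins
# cells occupied per coarse-grid point in the respective dimension
cells_per_coarse_point = cell_sizes[0] // target
print(target, cells_per_coarse_point)
```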

\subsection{Regularization Term}\label{SEC:P1:GRAV:REMARK:REG}
The \verb|GravityInversion| class supports the following form for the
regularization term. For example, the statement
\begin{verbatim}
inv.setup(dom, w0=0.1, w1=[0,0,1])
\end{verbatim}
would lead to the same regularization as the statement above.

