The Peng-Robinson equation of state was used to model vapor-liquid equilibrium. The Newton-Raphson method was applied both to solve the Rachford-Rice equation and to find the roots of the cubic equation of state.
The root for each phase was selected by Gibbs energy minimization. The computed results agree quantitatively with those of the commercial simulator Aspen HYSYS over industrially relevant ranges of composition, pressure, and temperature. Furthermore, the results fall within the precision required for compositional modeling of petroleum fluids. Composition and compressibility-factor simulations for a petroleum fluid were performed using real field conditions.
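The flash calculation described above can be illustrated with a minimal sketch of the Rachford-Rice step. This is not the authors' code; it assumes the K-values have already been obtained (in the full method they come from the Peng-Robinson equation of state) and solves only for the vapor fraction by Newton-Raphson:

```python
def rachford_rice(z, K, V=0.5, tol=1e-10, max_iter=50):
    """Solve the Rachford-Rice equation for the vapor fraction V by
    Newton-Raphson, given feed composition z and K-values K (assumed
    known, e.g. from a Peng-Robinson fugacity update)."""
    for _ in range(max_iter):
        # f(V) = sum_i z_i (K_i - 1) / (1 + V (K_i - 1))
        f = sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                for zi, Ki in zip(z, K))
        # analytic derivative of f with respect to V
        df = -sum(zi * (Ki - 1.0) ** 2 / (1.0 + V * (Ki - 1.0)) ** 2
                  for zi, Ki in zip(z, K))
        step = f / df
        V -= step
        if abs(step) < tol:
            break
    # phase compositions implied by the converged vapor fraction
    x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]  # liquid
    y = [Ki * xi for Ki, xi in zip(K, x)]                      # vapor
    return V, x, y
```

In the full algorithm this solver sits inside an outer loop that updates the K-values from the equation-of-state fugacities until they stop changing.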
The results were analyzed from a thermodynamic viewpoint and were consistent with both theory and practice. Phase-behavior prediction is essential for the development and optimization of hydrocarbon production in petroleum engineering applications.
THERMODYNAMIC MODELING OF VAPOR-LIQUID EQUILIBRIUM FOR PETROLEUM FLUIDS - Blucher Proceedings
Software design is also an important aspect of parallel computing. An application can be parallelized if it is possible to identify parts of the program that can be executed independently and simultaneously on separate processors.
Programs usually need to be adapted to this type of processing in order to take advantage of parallel machines, which is not always a trivial task. There are programming tools that make this adaptation easier, for instance object-oriented programming techniques that support parallel programming.
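The adaptation described above amounts to splitting the work into units with no mutual dependencies. A minimal sketch, with a hypothetical workload standing in for real processing, might look like this:

```python
from multiprocessing import Pool

def process_block(block):
    # Independent unit of work: a stand-in for, e.g., filtering one
    # block of seismic traces (hypothetical workload).
    return sum(v * v for v in block)

def parallel_map(blocks, workers=4):
    # The blocks have no mutual dependencies, so the pool can run
    # them simultaneously on separate processors.
    with Pool(workers) as pool:
        return pool.map(process_block, blocks)

if __name__ == "__main__":
    blocks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]
    print(parallel_map(blocks))
```

The hard part in practice is the decomposition itself: choosing block boundaries so that the units really are independent, which is exactly the non-trivial task mentioned above.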
Generally speaking, software can be classified into two groups, proprietary or open, depending on whether or not the code has an owner. Open software allows many people to take part in its development and testing, and the resulting products can be used by anyone.
This makes it possible to obtain high-quality products at low cost and to create more standard products. The UNIX operating system illustrates this point: at the very beginning it was open and free, and only later did it become proprietary. Unix was developed for powerful computers such as workstations and has replaced many other proprietary operating systems. Currently, an important example of this open-systems philosophy is the GNU project (a recursive acronym for "GNU's Not Unix"), sponsored by the Free Software Foundation, which has allowed advanced users to modify code at a level of depth that closed software does not permit.
The power of the open-systems style has been amplified by the widespread use of the Internet. Thus, it is now easy to find and use powerful, free, open-source, multi-platform tools, including programming languages and the Linux operating system, which has become an increasingly popular and robust tool. Lately, with new technological developments in networking, it has become possible to join many interconnected Von Neumann-type machines into a large parallel machine.
Thus, with the development of computer networks and the increasing capabilities of general-purpose PCs, a more affordable option became available: using networked PC processors to run parallel programs. Linux has therefore become an increasingly interesting option, enabling a group of heterogeneous Unix computers connected to a network to operate as a single machine for solving large problems.
There are developments that integrate hardware with free software and can be used on these systems. Below, cases of computing techniques applied to exploration and production illustrate how these technologies can enhance methods and results. Seismic waves are an important source of information about the geometrical and physical properties of rocks, information required in hydrocarbon exploration. The technology that exploits wave propagation to obtain geological information is known as the seismic method.
To obtain seismic data, elastic waves are generated at the Earth's surface and later detected there, after being reflected at rock interfaces. This raw data needs processing to produce seismic sections, which are subsequently interpreted to obtain geological models. Sophisticated algorithms have been developed to process seismic data; seismic migration is an important example. In migration, a wave-propagation model is applied to the recorded wave field to obtain the properties of geological interfaces (reflectors).
Seismic modeling, on the other hand, is the numerical simulation of wave-field propagation through a geological model. It is another useful tool of the seismic method, since it helps to clarify the relationship between geological models and seismic images. Given their basic principles, migration and modeling can be considered inverses of one another. Both rest on mathematical models of wave propagation.
A variety of such models, differing in accuracy and complexity, have been used. An example is the acoustic model, the most common in the seismic method, which considers only compressional waves (the only wave mode that propagates through fluids). The model can be 2-D or 3-D. Typically, isotropy and homogeneity of the medium are the basic hypotheses. The acoustic model underlies almost all migration algorithms in current use. The elastic model includes, besides compressional waves, the shear (secondary) waves typical of solids.
Even though this model is much more difficult to implement, it is more accurate, and there is important research in that direction. There is also research into more complete mathematical models that include anisotropic and anelastic properties. Although the mathematical model of elastic wave propagation can encompass many subtleties, simplified versions are used in practical applications.
The main reason for this limitation is the demand on computer resources. Huge amounts of data are recorded in a typical seismic survey (a 10x10 km 3-D survey may require storage on the order of hundreds of gigabytes), and the processing algorithms are usually applied to a significant part of these data. In modeling and migration, for example, the wave-field characteristics have to be calculated for each sampling time, at many space locations, and probably many times over. The more accurate the mathematical model, the more comprehensive the information extracted from the data, but also the more demanding on computer resources.
Therefore, a trade-off must be made between the accuracy of the numerical algorithm and the computational requirements. In this section we illustrate the demand on computer resources and its relation to the accuracy of solutions for seismic modeling and migration. Two modeling methods, finite differences (FD) and ray tracing, are used to that end; together they span the range of seismic modeling and migration methods. FD is a technique for solving numerical problems represented by partial differential equations, substituting finite differences for infinitesimal differentials.
The finite-difference approximation, together with some numerical factors, defines the accuracy of the result. This method can generate the complete wave field, enabling studies that are hardly feasible by other means, such as establishing complex geometrical settings and implementing different wave-equation approaches. An example of FD modeling of the wave equation is presented in the next section. It uses the 2-D elastic wave algorithm developed by Levander, in which the medium is represented as a grid of squares and the geological model is defined by its geometrical and elastic properties.
The energy source provides the initial conditions, and boundary conditions are imposed at the model edges. The entire wave field is calculated sequentially, in time steps, throughout the geological model. In FD, computational efficiency is very sensitive to the size of the spatial grid and to the number of time steps. At the same time, these parameters govern the accuracy of the result: if the grid size and time sampling are not fine enough for the velocity field, the calculation can become unstable; insufficient sampling of short-wavelength components also causes dispersion, a numerical noise.
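The time-stepping scheme just described can be sketched in one dimension. This is a much-simplified stand-in for the 2-D elastic algorithm (an acoustic equation, a single source, fixed boundaries), but it shows the grid, the time loop, and the Courant stability ratio mentioned above:

```python
import numpy as np

def fd_acoustic_1d(nx=200, nt=500, dx=5.0, dt=0.001, c=2000.0, src=100):
    """Second-order finite-difference time stepping of the 1-D acoustic
    wave equation u_tt = c^2 u_xx.  All parameter values are illustrative."""
    # Squared Courant number; the scheme is stable only if c*dt/dx <= 1,
    # which is why a low-velocity layer forces a finer grid (smaller dx).
    r = (c * dt / dx) ** 2
    u_prev = np.zeros(nx)   # wave field at time step n-1
    u = np.zeros(nx)        # wave field at time step n
    for it in range(nt):
        lap = np.zeros(nx)
        # discrete second spatial derivative on the interior nodes
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
        # explicit update: u(n+1) = 2 u(n) - u(n-1) + r * laplacian
        u_next = 2.0 * u - u_prev + r * lap
        # inject a short Gaussian pulse at one grid node as the source
        t = it * dt
        u_next[src] += np.exp(-((t - 0.05) / 0.01) ** 2)
        u_prev, u = u, u_next
    return u
```

With these values c*dt/dx = 0.4, safely below the stability limit; halving dx without also reducing dt would push the ratio toward instability, which is the cost trade-off discussed in the text.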
Thus, depending on the geological model, the computational cost increases with the physical accuracy. Figure 1 illustrates the result of an FD modeling example. The geological model includes a low-velocity near-surface layer, which forces a very fine grid. Notice the number of seismic events and the fine gradation in amplitudes and wave shapes. Compared to FD, ray tracing is a quite simple model of wave propagation, in which the wave front, a real physical property, is represented by its normal, known as a ray (Fagin). With this method it is possible to obtain the path and travel time of any specific trajectory, as defined by the source and receiver locations.
However, other characteristics of the wave field, such as amplitude and phase, must come from independent calculations based on approximations. Figure 2 illustrates the results of a ray-tracing algorithm applied to the same model as Figure 1.
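The simplicity of ray tracing relative to FD can be seen in a minimal kernel. The sketch below (not the algorithm behind Figure 2, which handles general models) traces one downgoing ray through flat horizontal layers using Snell's law, returning only kinematics, i.e. travel time and offset, with no amplitude or phase information:

```python
import math

def trace_ray(thicknesses, velocities, takeoff_deg):
    """Trace a downgoing ray through horizontal layers using Snell's law:
    the ray parameter p = sin(theta)/v is constant along the ray.
    Returns one-way travel time and horizontal offset."""
    p = math.sin(math.radians(takeoff_deg)) / velocities[0]
    t = x = 0.0
    for h, v in zip(thicknesses, velocities):
        s = p * v                    # sin(theta) in this layer
        if s >= 1.0:                 # post-critical: no transmitted ray
            raise ValueError("ray totally reflected at a layer boundary")
        c = math.sqrt(1.0 - s * s)   # cos(theta)
        t += h / (v * c)             # path length h/cos(theta) at speed v
        x += h * s / c               # horizontal advance h*tan(theta)
    return t, x
```

Per ray this is a handful of arithmetic operations per layer, versus an FD update of every grid node at every time step, which is the source of the large efficiency gap noted below.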
Comparing Figures 1 and 2, we can observe that the FD method yields a more complete wave field. The dynamic characteristics, such as amplitudes and waveforms, are also more faithful in this method. However, there is a large difference in computational efficiency, since for the same model FD requires many times the computer time of ray tracing on a single processing unit.
If the model were 3-D, probably a thousand times more machine resources would be required. Even on a parallel computer with a thousand processing units, the 3-D model would still require a couple of weeks to process fully. Migration methods also illustrate the limitations of ray-trace-based algorithms. Figure 3 shows two types of migration based on different wave-propagation models. The first, Figure 3a, uses a ray-trace-based method known as Kirchhoff migration.
Figure 3b corresponds to migration using a more complete model, based on the wave equation. The latter solution is more demanding on computer resources and more accurate, as the results show. This type of solution is available today thanks to simultaneous advances in computing and seismic-processing methods. Because computer power limits the use of more comprehensive solutions, wave-propagation simulation, and especially finite differences, has been an object of interest for parallel-computer applications.
For example, Fricke applied them to reverse-time migration, and Myczkowski et al. used a Connection Machine supercomputer for modeling purposes. More recently, Villarreal and Scales developed algorithms for 3-D waves in acoustic media using a network of Linux machines.
In the next section, we describe a case of open-software libraries applied to exploration. As shown there, open-software tools are integrated with commercial software and standard formats to build a practical tool for visualizing 3-D seismic data.
Visualization is the transformation of data into pictures. A picture can display the data directly, or can present its information content in novel ways that give us insight into its meaning (Schroeder et al.). Visualization is almost a mandatory tool, required to make sense of the frequently overwhelming flood of information in today's world. An example is the information generated by the seismic method, especially 3-D seismic data, which provides a complete image of earth volumes. As presented before, seismic surveys can currently absorb hundreds of gigabytes in a few days, while seismic processing itself also produces large volumes of information.
Without visualization, most of these data would remain unseen on computer disks and tapes, without meaning or sense. This tool lets us apply the best of our abilities, the visual system, to understand and analyze information. In recent years, the software division of the company Numerica has used free software programming tools to develop visualization software.
This work led to the implementation of sophisticated visualization tools that readily satisfy industry demands. These tools have been tested at the geophysics laboratory and the reservoir-characterization group of the ICP (Colombian Petroleum Institute). Visualization tools are difficult to categorize, because most offer a variety of functions covering many different applications, and many functions overlap. Nevertheless, a number of commercial and free visualization software packages can be identified; the most widely accepted and used is OpenGL.
In exploration geophysics, most simple targets have already been found and depleted. New prospects are structurally complex, requiring 3-D acquisition and imaging. The need to build structural and velocity models effectively for 3-D imaging, and to analyze the resulting image effectively, has driven the demand for more sophisticated viewing software. In the following section we explain basic concepts of the data model selected to represent the seismic information, as well as the models selected to represent the surface maps.
An example is illustrated in Figure 4. The data model consists of two parts: an organizing structure (such as shape or geometry), and information associated with each element of the structure, known as attribute data. The organizing structure consists of points and cells (lines, polylines, triangles, quadrilaterals, tetrahedra, hexahedra, etc.). This information can also be classified as regular or irregular (alternatively, structured or unstructured).
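The two-part data model can be made concrete with a small sketch. This is not the actual library structure used by the tools described above; it is a hypothetical, minimal version of an unstructured (irregular) grid, with points and cells as the organizing structure and a per-point attribute attached to it:

```python
from dataclasses import dataclass, field

@dataclass
class UnstructuredGrid:
    """Minimal two-part data model: an organizing structure
    (points and cells) plus attribute data for each element."""
    points: list                 # geometry: [(x, y, z), ...]
    cells: list                  # topology: e.g. triangles as point-index triples
    point_data: dict = field(default_factory=dict)  # attributes per point

# Two triangles forming a quad, with a seismic-amplitude attribute
# (illustrative values) attached to each point.
grid = UnstructuredGrid(
    points=[(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)],
    cells=[(0, 1, 2), (1, 3, 2)],
    point_data={"amplitude": [0.1, -0.3, 0.7, 0.2]},
)
```

A regular (structured) grid would not need the explicit cell list, since connectivity is implied by the point ordering; that is precisely the regular/irregular distinction drawn above.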