The Director's Corner: Future Shapes of What's to Come

Steve Adamec, NAVO MSRC Director
The title for this issue of the Navigator says it all: "Future Shapes of What's to Come..." The NAVO MSRC is undergoing a carefully planned series of enhancements which, when completed in summer 2000, will provide one of the most capable, productive, and balanced high performance computing (HPC) environments in the world. These enhancements substantially boost the MSRC's computational capabilities across the primary HPC architectures we support—distributed-memory parallel, shared memory, and parallel vector. They also boost the mass storage and networking capabilities of the MSRC.

The most successful and requested HPC system within the NAVO MSRC, the Cray T3E, has been expanded by 33 percent to 1,088 processors and approximately 400 GB of aggregate memory. A new IBM RS/6000 SP system with 1,336 POWER3/375 processors and 1.3 TB of aggregate memory, one of the largest systems ever built by IBM, is being installed as this issue goes to press. Both the Cray T3E and the IBM SP are intended to provide the premier, high-end computational environment for applications throughout the Department of Defense (DoD) that simply cannot be run on lesser systems—applications that typically run as DoD Challenge projects. To supplement this enormous distributed-memory parallel computational capability, we have installed a Sun HPC10000 computational server with 64 processors and 64 GB of shared memory—a unique resource that primarily supports interactive shared-memory HPC work and diagnosis/debugging work for large parallel applications. The SGI Origin 2000 systems (now known formally as the SGI 2800) have been merged and expanded into a single 256-processor system running a single IRIX image, greatly enhancing their capability to support shared-memory Challenge project work. The Cray T932 will be retired at year's end, with two additional Cray SV1 nodes (for a total of four) being added to the existing Cray SV1 system. The upgraded SV1 will be configured with 4 gigawords (i.e., 32 GB) of shared memory—the largest memory ever offered on a parallel vector system at the NAVO MSRC. Finally, the internal mass storage and networking capabilities of the MSRC have been completely reengineered for performance and resiliency, providing a perfect fit with our new OC-12 wide-area Defense Research and Engineering Network (DREN) connectivity.

We invite you, the DoD user community, to let us assist you in utilizing this cutting-edge capability in support of your HPC needs.
About the Cover: Future shapes of what's to come... The US Army's Theater High Altitude Area Defense Interceptor (THAAD) is a finless missile that utilizes ten liquid bi-propellant divert and attitude control jets for maneuvering during acquisition and homing of an incoming target. Through extensive use of Computational Fluid Dynamics (CFD), researchers have been able to generate engineering and aerodynamic data that would otherwise not have been available for the development and checkout of the THAAD missile (see story on page 12).
The Naval Oceanographic Office (NAVO) Major Shared Resource Center (MSRC): Delivering Science to the Warfighter

The NAVO MSRC provides Department of Defense (DoD) scientists and engineers with high performance computing (HPC) resources, including leading-edge computational systems, large-scale data storage and archiving, scientific visualization resources and training, and expertise in specific computational technology areas (CTAs). These CTAs include Computational Fluid Dynamics (CFD), Climate/Weather/Ocean Modeling and Simulation (CWO), Environmental Quality Modeling and Simulation (EQM), Computational Electromagnetics and Acoustics (CEA), and Signal/Image Processing (SIP).

NAVO MSRC
1002 Balch Boulevard
Stennis Space Center, MS 39522
1-800-993-7677 or [email protected]

Contents

The Director's Corner
2  Future Shapes of What's to Come

Feature Articles
4  A Global Ocean Nowcast/Forecast System
7  Large-Scale Atom Simulation

Scientific Visualization
10  Dynamic Visualizations
12  Theater High Altitude Area Defense

Programming Environment and Training
14  FY 2000 PET Highlights
14  Scientific Visualization Workshop
15  PET Workshops
16  Tiger Teams

Computer Systems and Support
19  Supporting the Supercomputers—Part 1 of 2

Application Challenges
20  T3E Scalability

The Porthole
21  A Look Inside NAVO MSRC

User Services
22  Navigator Tools and Tips

Events
23  Upcoming Events
23  Recent Events

Summary of Links Found Inside
http://www7320.nrlssc.navy.mil/global_nlom/index.html
http://www.cclms.lsu.edu/cclms/research/research-nanophase.html
http://mesa3d.sourceforge.net
http://www.opendx.org/
http://www.ssec.wisc.edu/~billh/visad.html
http://www.navo.hpc.mil/pet
http://www.navo.hpc.mil/pet/viz2000
http://www.avs.com/products/AVS5/avs5.htm
http://www.legion.virginia.edu/workshops.html
http://www.navo.hpc.mil/cgi-bin/pet/Training/training.cgi/
http://www.sdsc.edu/PET/NAVO
http://www.navo.hpc.mil/Tools/Queues/
http://www.navo.hpc.mil/cgi-bin/pet/Training/Resources/resources2.cgi?resource=avs
www.siggraph.org/
www.sc00.org/
www.tu-chemnitz.de/informatik/RA/cluster2000/

NAVO MSRC Navigator
www.navo.hpc.mil/navigator

NAVO MSRC Navigator is a bi-annual technical publication to inform users of the news, events, people, accomplishments, and activities of the Center. For a free subscription or to make address changes, contact the editor.

EDITOR: Toni Hampton, [email protected]

DESIGNERS: Brian Asbury, [email protected]; Patti Geistfeld, [email protected]; Lynn Yott, [email protected]

Any opinions, conclusions, or recommendations in this publication are those of the author(s) and do not necessarily reflect those of the Navy or NAVO MSRC. All brand names and product names are trademarks or registered trademarks of their respective holders. These names are for information purposes only and do not imply endorsement by the Navy or NAVO MSRC.

Approved for Public Release; Distribution Unlimited
A Global Ocean Nowcast/Forecast System

Harley E. Hurlburt and Alan J. Wallcraft, Naval Research Laboratory (NRL)

Project: Global and Basin-Scale Ocean Modeling and Prediction
Principal Investigators: Harley E. Hurlburt, Alan J. Wallcraft, E. Joseph Metzger, Robert C. Rhodes, Jay F. Shriver, and Charlie N. Barron, Naval Research Laboratory, Stennis Space Center, Mississippi; Ole Martin Smedstad and Jean-Francois Cayula, Planning Systems Incorporated; A. Birol Kara, Sverdrup Technologies
Assigned Site/System: NAVO MSRC SGI/Cray T3E
CTA: Climate/Weather/Ocean Modeling and Simulation
URL: http://www7320.nrlssc.navy.mil/global_nlom/index.html

This project is aimed at developing an ocean prediction system that will revolutionize the ability to nowcast and forecast the global ocean circulation down to the scale of oceanic fronts and eddies. A real-time demonstration for the Pacific Ocean is currently running at the NAVO MSRC with 1/16° resolution, a demonstration that will be expanded to the global ocean in the near future. Department of Defense (DoD) High Performance Computing (HPC) resources and DoD HPC Challenge projects at the NAVO MSRC and the U.S. Army Engineer Research and Development Center (ERDC) MSRCs have been critical to this effort.

INTRODUCTION

DoD HPC and a DoD HPC Challenge grant on the NAVO MSRC SGI/Cray T3E are critical components of a coordinated 6.1-6.4 Naval Research Laboratory (NRL) effort on the "Grand Challenge" problem of eddy-resolving global ocean modeling and prediction. The goal is a fully eddy-resolving, data-assimilative global ocean prediction system with at least 1/16° (~7.5 km) resolution in the near term, with an upgrade to 1/32° when the available computer power is sufficient. A 1/16° system is currently running in demonstration mode for the Pacific north of 20°S, and expansion to the global ocean is planned in the near future using DoD HPC Challenge resources at the NAVO MSRC and the NRL Layered Ocean Model (NLOM).

The 1/16° Pacific NLOM system already gives a real-time view of the ocean down to the 50-200 km scale of ocean eddies and the meandering of ocean currents and fronts, a view with unprecedented resolution and clarity. This can be seen at the URL above. The system has demonstrated forecast skill for a month or more for many ocean features, including the fronts and eddies. The assimilation of satellite altimeter data into this system makes effective use of the near-real-time altimeter data from TOPEX/POSEIDON and ERS-2 that is available from the Naval Oceanographic Office's (NAVOCEANO's) Altimetry Data Fusion Center (ADFC). The effectiveness of the model assimilation of altimeter data is greatly enhanced by the 1/16° resolution of the Pacific system, as demonstrated by comparison to corresponding results at coarser resolution. Other data, such as sea surface temperature and sparse vertical profiles of temperature and salinity, will be assimilated as well.

BACKGROUND

Ocean forecasting is in principle similar to atmospheric forecasting, but with two major complications: (a) ocean eddies, at about 100 km across, are typically 20 to 30 times smaller than comparable atmospheric highs and lows, which means that roughly four orders of magnitude more computer time and three orders of magnitude more computer memory are required; and (b) there are relatively few observations below the ocean surface, so data assimilation is effectively confined to using satellite observations of the surface. On the other hand, the duration of forecast skill for the ocean is not restricted to the 10- to 14-day limit for atmospheric highs and lows: we have demonstrated at least 30-day predictive skill for ocean eddies and the meandering of ocean currents and fronts, given sufficient ocean model resolution and satellite altimeter data from TOPEX/POSEIDON and ERS-2.
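A rough scaling estimate, ours rather than the article's, shows where those orders of magnitude come from. Take a feature-size ratio r of about 25 (the midpoint of the 20-to-30 range above): resolving the smaller ocean features refines both horizontal dimensions by r, and numerical stability forces the time step down by roughly r as well, while the vertical grid is held fixed:

    memory  ∝ r^2 ≈ 25^2 ≈ 6 × 10^2    (about three orders of magnitude)
    compute ∝ r^3 ≈ 25^3 ≈ 1.6 × 10^4  (about four orders of magnitude)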
A major component of NRL's ocean modeling program has been a detailed study of the resolution required for ocean prediction. We have strong evidence that NLOM and other ocean models (including all the popular global and basin-scale ocean models) need to use grid cells for each prognostic variable that are at most about 8 km across at mid-latitudes. Our research has shown that doubling the resolution to 4 km per cell gives substantial improvement, but doubling again to 2 km gives only modest additional improvement. Due to ocean modeler preference and choice of finite-difference grid design, there is significant variation in how such resolution is expressed in degrees, the most common way to describe ocean model resolution; for the NLOM grid these cell sizes translate to 1/16°, 1/32°, and 1/64° resolution, respectively. This applies to global and basin-scale models. Coastal models would use the global forecast for boundary conditions and would require much smaller cells, but would cover only a limited region.

At 4 km, the optimal resolution is finer than might be expected based on the size of eddies. In relation to ocean eddy size, it is similar to the resolution currently used by the leading weather forecasting models in relation to the size of atmospheric highs and lows. More specifically, our research has shown that fine resolution of the ocean eddy scale is required to obtain coupling between upper ocean currents and seafloor topography via turbulent flow instabilities. This coupling can strongly affect the pathways of upper ocean currents and fronts, including the Gulf Stream in the Atlantic and the Kuroshio in the Pacific. The high resolution is also required to obtain sharp fronts that span major ocean basins and a nonlinear effect on the large-scale C-shape of ocean gyres, such as the Sargasso Sea in the Atlantic.

TECHNICAL APPROACH

As far back as 1989, the President's Office of Science and Technology recognized global ocean modeling and prediction as a "Grand Challenge" problem, defined as requiring a computer system capable of sustaining at least one trillion floating-point adds or multiplies per second. By taking a multi-faceted approach to cost minimization, we are solving the problem on systems capable of only a small percentage of this performance. One facet is experiment sequences that use the largest cell size possible and an ocean basin rather than the entire globe whenever possible. This only gets us so far, since in the end there is no substitute for small cells and a global domain.

Another facet is the use of NLOM, which has been specifically designed for eddy-resolving global ocean prediction. It is tens of times faster than other ocean models in computer time per model year for a given horizontal resolution and model domain. NLOM's performance is in turn due to a range of design decisions, the most important of which is the use of isopycnal (density-tracking) layers in the vertical rather than the more usual fixed-depth cells. Density is the natural vertical coordinate for the stratified ocean, and it allows seven NLOM layers to replace the 100 or more fixed levels that would be needed at 1/16° resolution. Another important advantage of this approach is that there is less need to increase the number of density-tracking layers as the horizontal cell size is reduced. With NLOM, halving the cell size requires about 8 times as much computer power (4 times from the number of cells plus 2 times from the required smaller time step), but with fixed-level ocean models the number of cells in the vertical should also be doubled, requiring about 16 times as much computer power.
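The cost arithmetic in the preceding paragraph is easy to make concrete. The short Python sketch below is not NLOM code; it simply encodes the stated assumptions: the number of horizontal cells grows quadratically as the cell size shrinks, the number of time steps grows linearly, and only a fixed-level (non-isopycnal) model must also refine its vertical grid.

    # Relative computer power needed after refining an ocean model's grid.
    def relative_cost(refinement: float, refine_vertical: bool) -> float:
        horizontal = refinement ** 2  # more cells in both x and y
        time_steps = refinement      # smaller dt means more steps
        vertical = refinement if refine_vertical else 1.0
        return horizontal * time_steps * vertical

    # Halving the cell size (refinement factor 2):
    print(relative_cost(2, refine_vertical=False))  # 8.0, isopycnal NLOM
    print(relative_cost(2, refine_vertical=True))   # 16.0, fixed-level model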
A third facet is a widely portable NLOM computer code, developed under the Common High Performance Computing Software Support Initiative (CHSSI) program, that targets large scalable computers with high-speed network connections. NLOM exhibits very good scalability (wall-time speedup as more processors are used) on such systems. For example, we have routinely run NLOM on up to 1,152 Cray T3E processors at a sustained speed of up to about 100 billion useful floating-point operations per second.

A final facet of our efficiency drive is the use of an inexpensive data assimilation scheme backed by a statistical technique for relating surface data to subsurface fields. The statistics are from an atmospherically forced 20-year inter-annual simulation of the same ocean model, an application that requires a model with high simulation skill.

So far it has been possible to run NLOM in demonstration mode with 1/32° resolution globally (72°S-65°N) and 1/64° resolution over the basin-scale subtropical Atlantic (9°N-51°N), including the Caribbean and Gulf of Mexico. While, at present, these require greater computer resources than practical for an operational product, they do give information on the value added of increasing resolution and insight into model performance at 1/16° resolution.

RESULTS

In August 1999, we started running 1/16° Pacific NLOM in near-real time (i.e., updated every few days). In hindcast studies that followed standard nowcast and forecast procedures, but used data from a previous time period, we compared 1/16° Pacific NLOM with 1/4° global NLOM. Both studies assimilated satellite altimeter data from TOPEX/POSEIDON and ERS-2, and month-long forecasts started from the data-assimilative nowcast states were then performed. An example is shown in Figure 1, with comparison to an independent 1/8° SST analysis from the Modular Ocean Data Assimilation System (MODAS) that is operational at NAVOCEANO. The SST analysis shows the correspondence between the Kuroshio pathway and some of the eddies seen in the sea surface height field from the Pacific Ocean model, a field observed by satellite altimetry as shown by the overlain ground tracks in Figure 1 (top panel).

[Figure 1. Zoom on the Kuroshio current region south and east of Japan; panels span 30N-40N, 130E-180. (Top) Sea surface height (SSH) for 15 January 1999 from the NRL 1/16° Pacific model with assimilation of satellite altimeter data from TOPEX/POSEIDON and ERS-2; the altimeter tracks with data available for this update cycle are overlain. (Middle) The corresponding SSH snapshot from a 14-day forecast initialized from 1 January 1999. (Bottom) MODAS 1/8° SST analysis from satellite IR imagery; contour interval 0.08°C. The SST color bar is designed to highlight the Kuroshio pathway.]

The 1/16° Pacific model has given the very first basin-wide skillful forecast demonstration of oceanic fronts and eddies for any ocean basin. The Pacific results also demonstrate that altimeter data alone are sufficient to produce an accurate nowcast when a high-resolution ocean model is in the loop to fill in the space-time gaps in the altimeter data. The 1/4° global model was much less successful in this feasibility demonstration, as was expected from the earlier discussion of resolution requirements. In addition, the nowcast/forecast results at 1/16° resolution were substantially better than expected for a system that uses satellite altimeter data as the only observing system, due mainly to doubts about the space-time resolution adequacy of the altimeter data. SST assimilation will be added in the near future.

The global results for SST have also exceeded expectations, particularly for a model with only seven layers in the vertical. The embedded mixed layer in NLOM gives accurate SST based on accurate atmospheric forcing, even with no assimilation of SST data (or altimeter data). With climatological atmospheric forcing, global NLOM gives SSTs accurate to within .5°C for the annual mean and .7°C for the seasonal cycle. Global NLOM at 1/2° and 1/8° resolution was run over 1979-98 with 6-12 hourly atmospheric forcing from ECMWF and no assimilation of SST data, and then compared to 337 year-long daily time series of observed SST around the world over the 1980-98 time frame. The median rms error was .8 to .9°C and the median correlation coefficient was about .9, again with no assimilation of SST data. The modal bin for rms error was .6 to .8°C, and the modal bin for correlation was .95 to 1.0.

Figure 2 shows similar results from 1/16° Pacific NLOM, but with a comparison between results using Fleet Numerical Meteorology and Oceanography Center (FNMOC) Navy Operational Global Atmospheric Prediction System (NOGAPS) winds and European Centre for Medium-Range Weather Forecasts (ECMWF) winds over the time frame 1990-1998. In both cases ECMWF thermal fields were used, because such fields are not available from FNMOC prior to late 1997. However, the winds play a significant role in thermal forcing, and the FNMOC winds were used for that purpose in the FNMOC-forced run. As before, there was no assimilation of SST data. In both cases the median rms SST error was .84°C, but outside the equatorial region the SST results using FNMOC were noticeably better, .76°C vs. .84°C using ECMWF, with median correlations of .96 and .95, respectively. Note that the three years of daily SST in Figure 2 show substantial differences in the shape of the annual cycle, as well as shorter time-scale variability that is captured by the NLOM SSTs, an indication of skill for both the ocean model and the atmospheric forcing.

These results indicate that NLOM SST is sufficiently accurate to be used as a platform for assimilation of SST data (which has gaps due to cloudiness) and for SST forecasting.
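The SST skill statistics quoted above and tabulated in Figure 2 (RMS, ME, R, SS) are standard time-series diagnostics. A minimal Python sketch follows; the article does not define its skill score, so the common definition SS = 1 - MSE/Var(obs) is assumed here, and the function name is ours, not NRL's.

    import numpy as np

    def sst_skill(model: np.ndarray, obs: np.ndarray) -> dict:
        """Diagnostics for a modeled vs. observed daily SST time series."""
        err = model - obs
        mse = float(np.mean(err ** 2))
        return {
            "RMS": float(np.sqrt(mse)),                 # root mean square error
            "ME": float(np.mean(err)),                  # mean (bias) error
            "R": float(np.corrcoef(model, obs)[0, 1]),  # correlation
            "SS": 1.0 - mse / float(np.var(obs)),       # assumed skill score
        }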
[Figure 2. Daily SST: NLOM vs. buoy at 43°N, 130°W. Daily observed SST (°C) from a NOAA buoy at 43°N, 130°W (black) and modeled SST from 1/16° Pacific NLOM forced with 6-12 hourly winds from FNMOC NOGAPS (red) and from ECMWF (blue), for 1994 (top), 1995 (middle), and 1997 (bottom), January through December. Thermal forcing (except the wind component) is from ECMWF in both cases; NLOM included no assimilation of SST data. Statistics include root mean square error (RMS), mean error (ME), correlation (R), and skill score (SS). Panel statistics (FNMOC, ECMWF): 1994: RMS 0.59, 0.79°C; ME 0.39, 0.23°C; R 0.99, 0.96; SS 0.96, 0.92. 1995: RMS 0.60, 0.73°C; ME 0.31, -0.05°C; R 0.99, 0.97; SS 0.96, 0.94. 1997: RMS 0.68, 0.81°C; ME 0.42, 0.26°C; R 0.98, 0.96; SS 0.94, 0.92.]

Acknowledgments

This work is a contribution to the 6.2 Basin-scale Ocean Prediction System project funded by the Office of Naval Research under program element 62435N and to the 6.4 projects Large Scale Ocean Modeling and Ocean Data Assimilation funded by the Space and Naval Warfare Systems Command under program element 63207N. The numerical model was run on SGI/Cray T3Es at the Naval Oceanographic Office, Stennis Space Center, Mississippi, and at the U.S. Army Engineer Research and Development Center, Vicksburg, Mississippi. Both are part of the Defense Department's High Performance Computing Initiative.
Large-Scale Atom Simulation

Dr. Rajiv K. Kalia, Louisiana State University

Project: Computational Assisted Development of High Temperature Structural Materials
Principal Investigators: Rajiv K. Kalia, Aiichiro Nakano, and Priya Vashishta, Concurrent Computing Laboratory for Materials Simulations, Department of Physics and Astronomy and Department of Computer Science, Louisiana State University, Baton Rouge, Louisiana
Assigned Site/System: NAVO MSRC SGI Origin
CTA: Computational Chemistry and Materials Science
URL: http://www.cclms.lsu.edu/cclms/research/research-nanophase.html

Novel materials that can withstand high temperatures and extreme environments are generating considerable worldwide attention. Such materials are tremendously important for defense technologies. The basic requirements for designing materials that have low densities, elevated melting temperatures, high oxidation and corrosion resistance, the ability to resist creep, and high toughness encompass some of the most challenging problems in materials science.

With this DoD Challenge grant, we have performed large-scale (10^6-10^8 atoms) molecular dynamics (MD) simulations to investigate dynamic fracture in ceramics and nanocomposites and the dynamics of oxidation of metallic nanoparticles. These simulations have been executed with highly efficient, portable, and scalable multiresolution algorithms, including the fast multipole method for the long-range Coulomb interaction, a dynamic load-balancing scheme for mapping irregular applications onto parallel machines, and a fractal-based compression scheme for scalable I/O and data communication.

DYNAMIC FRACTURE IN CERAMICS AND NANOCOMPOSITES

Molecular dynamics simulations were performed to investigate dynamic fracture in crystalline SiC and GaAs at various temperatures using 256 nodes of the Cray T3E at the NAVO MSRC. The simulations were carried out on a thin-strip sample for which the mechanical energy release rate, G, can be calculated from knowledge of the applied strain, ε, and the value of the stress, σ, far ahead of the crack tip: G = Wσε/2. In addition to the mechanical energy release rate, the crack-tip velocity and local stress distribution at various temperatures were monitored. Figure 1 shows the results of 30-million-atom MD simulations of SiC at room temperature and 1,500K. We find that large shear stresses close to the crack tip cause cleavage fracture at room temperature. At elevated temperatures, dissipative effects due to dislocations, micropore formation and coalescence, and crack deflection cause stresses to spread out all over the system, thereby increasing the value of G.

[Figure 1. Shear stress distributions in 30-million-atom MD simulations of SiC at 300K and 1500K. Color scale: -5 to 5 GPa; scale bar: 1 µm.]
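As a quick numerical illustration (ours, not the authors' code), the thin-strip relation above is a one-liner in Python. Here W is taken to be the strip width, an assumption consistent with the standard thin-strip fracture geometry but not spelled out in the article, and the input values are purely hypothetical.

    def energy_release_rate(width_m: float, stress_pa: float, strain: float) -> float:
        """Thin-strip mechanical energy release rate, G = W*sigma*epsilon/2."""
        # width_m: strip width W (assumed meaning of W); stress_pa: stress far
        # ahead of the crack tip; strain: applied strain (dimensionless).
        return 0.5 * width_m * stress_pa * strain

    # Hypothetical inputs: a 1-micrometer strip at 5% strain and 5 GPa stress.
    print(energy_release_rate(1e-6, 5e9, 0.05))  # 125.0 J/m^2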
Similar effects are observed in our 100-million-atom fracture simulations of GaAs. The MD results also indicate that the brittle-to-ductile transition temperature in GaAs (~600K) is close to the experimental value (630-660K). Figure 2 shows snapshots of three different crack fronts in MD simulations of GaAs at room temperature.

[Figure 2. Snapshots of cracks in MD simulations of GaAs. Crack surfaces are (a) (110), (b) (111), and (c) (001).]

One of the most promising materials for high-temperature structural applications is a fiber-reinforced ceramic composite of a silicon nitride host embedded with fibers of silicon carbide. The fibers are coated with materials that form weak interfaces between the fibers and the matrix. We have performed 10-million-atom MD simulations to investigate the effect of interphase structure and residual stresses on fracture toughness in a silicon nitride matrix reinforced by silicon carbide fibers coated with amorphous silica layers (see Figure 3). Immersive visualization of these simulations reveals a rich diversity of atomistic processes, including fiber rupture, frictional pullout, and emission of molecular fragments, which must all be taken into account in the design of tough ceramic composites.

[Figure 3. (Left panel) Fractured silicon nitride (red) ceramic reinforced with silica-coated carbide fibers (yellow). (Right panel) Close-up of the fractured composite system. Small spheres represent silicon atoms, and large spheres represent nitrogen (green), carbon (magenta), and oxygen (cyan) atoms.]

DYNAMICS OF OXIDATION OF ALUMINUM NANOCLUSTERS

Dynamical oxidation simulations were motivated by experiments that reveal unique electrical, optical, and mechanical properties of nanocomposites consisting of passivated Al nanoparticles. This led us to perform the first successful MD simulation of the oxidation of an Al nanocluster, based on a highly reliable interaction scheme that can successfully describe a wide range of physical properties of both metallic and ceramic systems. This scheme includes changes in charge transfer as the atoms move and is thus capable of treating bond formation and bond breakage. Dynamic charge transfer gives rise to a computationally intensive Coulomb interaction which, for the number of atoms necessary in a realistic simulation, requires highly efficient algorithms that map well onto parallel architectures. The fast multipole method (FMM) of Greengard and Rokhlin was used to reduce the computational complexity from O(N^2) to O(N) for the long-range Coulomb interaction, with extensions for stress calculations. A multiple time-step algorithm further reduced the execution time by an order of magnitude. Both algorithms map very well onto parallel architectures.
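The speedup from the FMM comes from replacing pairwise sums over distant charges with a few multipole moments per cluster. The Python sketch below is a toy illustration of ours, not the Greengard-Rokhlin algorithm itself (which organizes such expansions in a hierarchical tree to reach O(N)); it compares a direct sum against a monopole-plus-dipole approximation for one well-separated cluster.

    import numpy as np

    def direct_potential(target, charges, positions):
        """Exact 1/r sum: O(N) per target point, O(N^2) over all pairs."""
        r = np.linalg.norm(positions - target, axis=1)
        return float(np.sum(charges / r))

    def multipole_potential(target, charges, positions):
        """Monopole + dipole approximation about the cluster center."""
        center = positions.mean(axis=0)
        d = target - center
        dist = float(np.linalg.norm(d))
        monopole = charges.sum() / dist
        dipole_moment = charges @ (positions - center)  # sum of q_i * r_i
        return float(monopole + dipole_moment @ d / dist ** 3)

    rng = np.random.default_rng(0)
    pos = rng.normal(size=(1000, 3))          # cluster of charges near the origin
    q = rng.choice([-1.0, 1.0], size=1000)
    far = np.array([50.0, 0.0, 0.0])          # well-separated target point
    print(direct_potential(far, q, pos))      # the two values should agree
    print(multipole_potential(far, q, pos))   # up to the truncated higher moments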
Dynamic oxidation simulations reveal a rapid three-step oxidation process culminating in a stable oxide scale within 50 ps. In the first 5 ps, oxygen molecules dissociate, and the oxygen atoms diffuse first into octahedral and subsequently into tetrahedral sites in the nanoparticle. In the next 20 ps, as the oxygen atoms diffuse radially into, and the Al atoms diffuse radially out of, the nanoparticle, the fraction of six-fold-coordinated oxygen atoms drops dramatically. Concurrently, there is a significant increase in the number of O atoms that form isolated clusters of corner-sharing and edge-sharing OAl4 tetrahedra. Between 30 and 35 ps, clusters of OAl4 coalesce to form a neutral, percolating tetrahedral network that impedes further intrusion of oxygen atoms into, and of Al atoms out of, the nanoparticle (see Figure 4). As a result, a stable oxide scale is formed. Structural analysis reveals a 40 Å thick amorphous oxide scale on the Al nanoparticle. The thickness and structure of the oxide scale are in accordance with experimental results.

[Figure 4. Initial stage of oxidation of an Al nanoparticle. Size distributions of OAl4 clusters at 20 ps (left) and 31 ps (right) are shown. Clearly, the clusters coalesce and percolate rapidly.]

Currently we are planning multiscale simulations to determine residual stresses, fracture toughness and hardness, friction and adhesion, and material degradation due to oxidation in: (1) polydimethylsiloxane (PDMS) on alumina and silica surfaces; (2) nanostructured composites consisting of aluminum nanoparticles passivated with oxygen; (3) Al/Al2O3, Al/SiC, and Ti/TiO2 metal/ceramic interfaces; and (4) functionalized atomic-force microscope (AFM) tips of silicon nitride and carbon nanotubes.

These Challenge applications require a methodology that can describe physical and chemical processes over several decades of length scales. Quantum mechanical (QM) simulations based on density functional theory (DFT) are performed in regions where atomic bonds are formed or broken; molecular dynamics (MD) simulations are carried out in the nonlinear regions surrounding the QM region; and the finite-element (FE) approach, with constitutive input from QM or MD calculations, is used in regions far away from the process zones. The QM, MD, and FE schemes are integrated with an approach based on control theory. Algorithms will be designed to carry out these hybrid QM/MD/FE simulations in DoD's metacomputing environment, with multiple parallel machines, mass storage devices, and immersive and interactive virtual environments on a grid with high-speed networks. A simple illustration of this spatial partitioning follows.
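The division of labor just described can be pictured as a distance-based partition around a process zone such as a crack tip. The Python sketch below is a toy illustration only: the labels, radii, and function name are ours, and the control-theory coupling the article mentions is well beyond this sketch.

    import numpy as np

    def assign_regions(atom_positions, zone_center, r_qm, r_md):
        """Label atoms QM (bond breaking), MD (nonlinear shell), or FE (far field)."""
        d = np.linalg.norm(atom_positions - zone_center, axis=1)
        return np.where(d < r_qm, "QM", np.where(d < r_md, "MD", "FE"))

    atoms = np.random.default_rng(1).uniform(-100.0, 100.0, size=(10000, 3))
    labels = assign_regions(atoms, np.zeros(3), r_qm=10.0, r_md=40.0)
    # One would then run DFT on atoms[labels == "QM"], classical MD on the
    # "MD" shell, and a constitutive FE model on the far-field "FE" region.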