With the rapid increase of computer power and progress in software
development, computer modeling is playing an ever more important role
in the technological world, covering a wide range of fields:
drug design in biotech; control and signal error correction in
telecommunications (your modem at home that allows you to connect
to the Internet at speeds higher than 600 bps, for example,
has to do such a job to make sure signals don't get corrupted);
pattern/voice recognition; failure analysis; and, of course, materials
design. The essence of computer modeling is to minimize an error subject
to constraints, with
the error defined through a set of rules or models. In the field of
materials science, this error is the deviation from the minimum
free energy (or internal energy at zero temperature) associated with
a set of macroscopic thermodynamic parameters (temperature, pressure,
number of particles). The modeling can be ab initio, i.e., defined
based on quantum mechanical principles, or empirical, i.e., constructed
by coarse-graining the microscopic degrees of freedom that we understand
and find irrelevant for our purposes.
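As a toy illustration of "minimize an error subject to constraints" (a made-up example, not one of our actual codes), the sketch below minimizes a simple quadratic "energy" under a linear constraint, enforcing the constraint with a quadratic penalty and using plain gradient descent:

```python
# Toy constrained minimization: minimize E(x, y) = (x-1)^2 + (y-2)^2
# subject to x + y = 1, enforced approximately by a quadratic penalty.
def minimize_with_penalty(mu=100.0, step=0.002, n_iter=5000):
    x, y = 0.0, 0.0
    for _ in range(n_iter):
        s = x + y - 1.0                      # constraint violation
        gx = 2.0 * (x - 1.0) + 2.0 * mu * s  # d(E + mu*s^2)/dx
        gy = 2.0 * (y - 2.0) + 2.0 * mu * s  # d(E + mu*s^2)/dy
        x -= step * gx
        y -= step * gy
    return x, y

x, y = minimize_with_penalty()
print(x, y)  # close to the constrained minimum (0, 1)
```

Increasing mu tightens the constraint. In a real materials code the "energy" comes from a force field or a density functional rather than a toy quadratic, but the structure of the problem is the same.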
Two basic modeling tools are molecular dynamics and Monte Carlo simulations.
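A minimal sketch of the Monte Carlo side, assuming a single particle in a harmonic well with spring constant and temperature set to one (a made-up test case): Metropolis sampling should reproduce the equipartition result <x^2> = k_B T / k = 1.

```python
import math
import random

# Metropolis Monte Carlo for one particle in a harmonic well, E(x) = x^2/2,
# at temperature k_B*T = 1.  Equipartition predicts <x^2> = 1.
def metropolis_x2(n_steps=200_000, burn_in=20_000, delta=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    total, count = 0.0, 0
    for step in range(n_steps):
        x_new = x + rng.uniform(-delta, delta)       # trial move
        dE = 0.5 * (x_new**2 - x**2)
        if dE <= 0 or rng.random() < math.exp(-dE):  # Metropolis acceptance
            x = x_new
        if step >= burn_in:
            total += x * x
            count += 1
    return total / count

print(metropolis_x2())  # close to 1.0
```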
Computational Research at MSC
Here at MSC, Caltech, I am primarily engaged in developing new ab initio,
density-functional-based molecular dynamics. The goal is
computer software that works for all the elements
of the periodic table, that scales quasi-linearly with the size of the
system so that large-scale simulations are possible, and that is compatible
with other quantum chemistry methods so that the accuracy of the density
functional calculations can be controlled if needed. This research
is now in its final stage. We have already completed the critical part: a
properly working electronic structure code
(Click here for a publication).
I am currently working on implementing the Generalized Car-Parrinello
method for doing ab initio molecular dynamics.
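Whatever the source of the forces, empirical or, as in Car-Parrinello schemes, derived from the electronic structure, the molecular dynamics ingredient is a time integrator. A minimal velocity-Verlet sketch for a 1D harmonic oscillator (a made-up test problem, not our production code) shows the good long-time energy conservation that makes this integrator the standard choice:

```python
# Velocity Verlet for a 1D harmonic oscillator (mass = spring constant = 1).
# The total energy E = v^2/2 + x^2/2 should be conserved to O(dt^2).
def velocity_verlet(x=1.0, v=0.0, dt=0.01, n_steps=10_000):
    f = -x                   # force at the initial position
    for _ in range(n_steps):
        v += 0.5 * dt * f    # half kick
        x += dt * v          # drift
        f = -x               # force at the new position
        v += 0.5 * dt * f    # half kick
    return x, v

def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

x, v = velocity_verlet()
print(energy(x, v))  # stays very close to the initial energy 0.5
```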
Another computational method that interests me is the coarse-graining
of atomistic Hamiltonians into macroscopic simulations. This is a very
important link that, if established, will ultimately enable us to
predict macroscopic properties from the first principles of quantum
mechanics. The result would be well-controlled engineering processes and
design. Right now, there is no theory that guarantees such a link is
possible. That is, it is not yet mathematically clear whether a macroscopic
Hamiltonian can be constructed from a microscopic one.
However, if we are willing to introduce statistical mechanics/thermodynamics
into the process of such coarse-graining, then the answer, in my view,
has to be positive. The foundation of this assertion? See Feynman's
textbook. The more subtle question is: if we introduce thermal fluctuations
into the coarse-graining, how do we avoid double counting?
For the microscopic degrees of freedom this is fine, since they
do not reappear in the macroscopic simulations. For the macroscopic
degrees of freedom this is more problematic, and I think the
non-uniqueness comes from here. It is possible that this problem
can be solved with a "minimal action" kind of method, but I haven't
gone through it yet.
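One way to state the coarse-graining, and the double-counting worry, a bit more precisely (a sketch in my own notation, not a finished formulation): split the microscopic degrees of freedom into retained macroscopic variables X and eliminated ones x, and define the effective Hamiltonian by a partial trace,

```latex
H_{\mathrm{eff}}(X) \;=\; -\,k_B T \,
  \ln \int \! dx \; e^{-H(X,\,x)/k_B T} .
```

H_eff is really a constrained free energy, and hence temperature dependent; double counting is avoided only if the fluctuations of the retained variables X are excluded from the integral and generated instead by the macroscopic simulation itself.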
The more tedious question is how to proceed.
Well, we have to define a set of thermodynamic variables.
We consider the small reservoir, the one we simulate with the microscopic
Hamiltonian, to be linked with a larger reservoir represented
by thermodynamic variables. We then carry out coarse-graining of
the equilibrium microscopic Hamiltonian subject to such boundary conditions.
The resulting Hamiltonian is thus compatible with the larger reservoir.
This process can be done in parallel spatially. The result is a macroscopic
simulation with microscopic accuracy, subject to a finite set of
thermodynamic variables.
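For a harmonic system this program can be carried out exactly, which makes a useful sanity check. The sketch below (a toy example, not the actual scheme) integrates the interior particles out of a harmonic chain: eliminating them from the force-constant matrix by a Schur complement replaces n unit springs in series with one effective spring of stiffness 1/n between the retained end particles.

```python
import numpy as np

def chain_force_constants(n_particles, k=1.0):
    """Force-constant matrix of a free harmonic chain with springs k."""
    K = np.zeros((n_particles, n_particles))
    for i in range(n_particles - 1):
        K[i, i] += k
        K[i + 1, i + 1] += k
        K[i, i + 1] -= k
        K[i + 1, i] -= k
    return K

def coarse_grain(K, keep):
    """Integrate out the degrees of freedom not in `keep` (Schur complement)."""
    keep = np.asarray(keep)
    drop = np.setdiff1d(np.arange(K.shape[0]), keep)
    K_rr = K[np.ix_(keep, keep)]
    K_re = K[np.ix_(keep, drop)]
    K_ee = K[np.ix_(drop, drop)]
    return K_rr - K_re @ np.linalg.inv(K_ee) @ K_re.T

# Chain of 5 particles (4 unit springs); keep only the two end particles.
K_eff = coarse_grain(chain_force_constants(5), keep=[0, 4])
print(K_eff)  # effective spring of stiffness 1/4 between the retained ends
```

Here K_eff plays the role of the macroscopic Hamiltonian for the boundary degrees of freedom; for anharmonic systems the partial trace no longer reduces to a matrix identity, which is where the real difficulty lies.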