Scientific Research Results

Application of the Direct Simulation Monte Carlo Method to High Speed Rarefied Gas Flows

(Text, Image / Professor Wu Tsung-Hsin, Department of Mechanical Engineering, National Chiao Tung University / Doctoral student Luo Ming-Chung)

Rarefied gas dynamics, which encompasses hypersonic flows, vacuum pump technology, low-pressure semiconductor materials processing, and micro- and nano-scale gas dynamics, plays a crucial role in modern research on advanced technologies. The Boltzmann transport equation, which describes the behavior of rarefied gases, is extremely hard to solve. The direct simulation Monte Carlo (DSMC) method is a particle simulation method: it has been mathematically proven that, when a sufficiently large number of simulated molecules is used, the method is statistically equivalent to solving the Boltzmann transport equation. However, it usually involves a very high computational cost, especially for transient flows and near-continuum flows. Therefore, parallel computing is indispensable for reducing the computation time of the DSMC method so that it can be applied effectively to rarefied gas flows.
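The collision step at the heart of DSMC can be conveyed with a minimal sketch (hypothetical parameters, not the PDSC implementation): candidate particle pairs are accepted for collision with probability proportional to their relative speed, and accepted pairs are scattered isotropically while conserving momentum and energy. Here an anisotropic velocity distribution relaxes toward equilibrium:

```python
import numpy as np

# Minimal 0-D DSMC collision sketch (hypothetical parameters, not PDSC):
# relax an anisotropic velocity distribution toward equilibrium by testing
# random particle pairs and scattering accepted pairs isotropically.
rng = np.random.default_rng(0)
n_particles, n_steps, pairs_per_step = 10_000, 100, 2_000

# Start far from equilibrium: "hot" in x, "cold" in y and z.
v = rng.normal(scale=[3.0, 1.0, 1.0], size=(n_particles, 3))

for _ in range(n_steps):
    i = rng.integers(0, n_particles, size=pairs_per_step)
    j = rng.integers(0, n_particles, size=pairs_per_step)
    g = np.linalg.norm(v[i] - v[j], axis=1)            # relative speeds
    accept = rng.random(pairs_per_step) < g / g.max()  # acceptance ~ g (hard spheres)
    for a, b in zip(i[accept], j[accept]):
        v_cm = 0.5 * (v[a] + v[b])                     # center-of-mass velocity
        g_ab = np.linalg.norm(v[a] - v[b])
        # Isotropic post-collision direction of the relative velocity.
        cos_t = 2.0 * rng.random() - 1.0
        sin_t = np.sqrt(1.0 - cos_t**2)
        phi = 2.0 * np.pi * rng.random()
        g_new = 0.5 * g_ab * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
        v[a], v[b] = v_cm + g_new, v_cm - g_new         # conserves momentum and energy

print("temperature anisotropy (var_x / var_y):", v[:, 0].var() / v[:, 1].var())
```

The ratio printed at the end drifts toward 1 as collisions redistribute energy among the velocity components; production codes such as PDSC add cell-based collision selection, boundary conditions, and domain decomposition on top of this basic move-and-collide cycle.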

The Aerothermal & Plasma Physics Laboratory (APPL) of National Chiao Tung University used PDSC, the parallel DSMC code that it has been developing for over a decade, to simulate the flow around EXPERT (European eXPErimental Re-entry Test-bed). The simulation results include the distributions of flow density, temperature, Mach number, surface pressure, and heat flux, and will provide a basic database and analysis capability for the future development of space vehicles. Because of the immense computational cost of the DSMC method, we are grateful to the NCHC for providing the computing resources and technical consultation that allowed this research to be carried out successfully.


Unknown Genomic Feature of Liver Cancer Discovered from In-depth Transcriptome Analysis

(Text, Image / Professor Huang Chi-Ying, Institute of Biomedical Informatics, National Yang Ming University)

The genome is generally taken to include DNA and RNA, and, downstream of them, proteins and carbohydrates. The DNA in the nucleus is often viewed as the book of life, while RNA refers to all of the messages transcribed from DNA. In other words, RNA carries the messages of life and controls the events possible in a cell, such as producing the proteins a cell requires. Owing to technological limitations, past genome research could focus on only about 2% of the genome, or even smaller DNA segments. With the introduction of next-generation sequencing (NGS) technology, however, more complete and comprehensive RNA information became available and expanded the scope of biomedical research.

The research team at National Yang Ming University focuses on the analysis of liver cancer RNA sequencing data obtained from NGS, and used the NCHC's supercomputers to analyze terabyte-scale data. This enabled the team to be the first in the intensely competitive global biomedical research community to discover a new RNA gene, named DUNQU1. The study not only expands the traditional definition of cancer genomes, but also provides a new genomic feature and a possible therapeutic target for liver cancer. The result was published in the prestigious journal Oncogene.


Crowning Protein – Using Crown Ether to Control the Surface Characteristics of Protein

(Text, Image / Dr. Li Cheng-Chung, Dr. Manuel Maestre-Reyna, Academician Wang Hui-Chun, Academia Sinica)

Protein molecules have great potential for applications in biomedicine and chemistry, but their biochemical characteristics and mechanisms are hard to control, making such applications difficult. Protein structural analysis provides a good research platform for understanding the characteristics of proteins. Protein structures are identified mainly through X-ray crystallography, meaning that the protein must first be grown into a crystal for structural analysis. To break through the bottleneck of protein crystal growth, the research team led by Academician Wang Hui-Chun discovered that crown ether can bind to the surfaces of many proteins to form protein-crown ether complexes. The team simulated these complexes in aqueous solution using the NCHC's supercomputer to understand their binding, and then exploited the fact that the complexes pack into regular lattices more readily than the bare protein to overcome the difficulty of growing protein crystals. This result will not only become an important technique for protein crystallography, but will also create new opportunities for protein applications in biology. The work was published in the international journal Angewandte Chemie.
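As an illustration of the kind of trajectory analysis such simulations involve (hypothetical file and residue names, not the team's actual workflow), the sketch below monitors crown-ether/lysine contacts along an MD trajectory with the MDAnalysis library:

```python
import numpy as np
import MDAnalysis as mda
from MDAnalysis.analysis import distances

# Hypothetical topology/trajectory and residue names; adjust to the actual
# force field and files used in a real study.
u = mda.Universe("protein_crown.psf", "protein_crown.dcd")

crown_O = u.select_atoms("resname CROWN and name O*")   # crown-ether oxygens
lys_N = u.select_atoms("resname LYS and name NZ")       # lysine ammonium nitrogens

min_dist = []
for ts in u.trajectory:
    d = distances.distance_array(crown_O.positions, lys_N.positions,
                                 box=ts.dimensions)
    min_dist.append(d.min())

min_dist = np.array(min_dist)
# A persistently small O...N distance (e.g. < 3.5 Angstrom) suggests the crown
# ether stays bound to a surface lysine throughout the simulation.
print("fraction of frames with a close contact:", np.mean(min_dist < 3.5))
```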

Heterogeneous States in Copper Oxide Superconductors

(Text, Image / Li Ting-Kuo, Director of the Institute of Physics, Academia Sinica)

In research on high-temperature superconductivity, researchers have always focused on the mechanism of superconductivity. We originally hoped that experimental observations would give us new understanding, but unfortunately the experiments revealed many different heterogeneous states at different doping concentrations. This led scientists to suspect that copper oxide superconductivity is unlike conventional superconductivity, which can be clearly described by the BCS model (the microscopic theory of superconductivity). This study aims to show that the many heterogeneous states share the same origin: strongly correlated electrons. We used the renormalized t-J model (a statistical model for strongly correlated electron systems) for mean-field diagonalization calculations and found many heterogeneous states. Figure (a) shows one possible heterogeneous state in real space; the energies of these states are very close, meaning that microscopic observations will inevitably find many heterogeneous states that are nearly degenerate in energy. Furthermore, we found that the symmetry of some heterogeneous states matches the experimental data. Figure (b) shows the symmetry obtained by Fourier transforming the distribution in Figure (a). This makes us more confident in the reliability of the t-J model. In the future, we will seek to resolve the pseudogap that appears before superconductivity, hoping to further understand the mechanism of high-temperature superconductivity.
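As an illustration of the Fourier analysis step (a toy pattern, not the actual mean-field output), the sketch below builds a real-space stripe modulation and locates its dominant wavevector from the power spectrum:

```python
import numpy as np

# Toy illustration: a real-space charge/pair-field pattern with a period-4
# stripe modulation, and its Fourier transform, whose peaks reveal the
# modulation wavevector of the heterogeneous state.
L = 32
x = np.arange(L)
X, Y = np.meshgrid(x, x, indexing="ij")
pattern = 0.1 * np.cos(2 * np.pi * X / 4)          # period-4 stripes along x

ft = np.fft.fft2(pattern)
power = np.abs(np.fft.fftshift(ft)) ** 2
q = np.fft.fftshift(np.fft.fftfreq(L)) * 2 * np.pi  # wavevectors in units of 1/a

ix, iy = np.unravel_index(np.argmax(power), power.shape)
print("dominant wavevector (qx, qy) =", (round(q[ix], 3), round(q[iy], 3)))
# Expect qx close to +/- 2*pi/4 ~ 1.571 and qy = 0 for this stripe pattern;
# the same transform applied to Figure (a) yields the symmetry shown in Figure (b).
```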


Simulation of CO2 Capture Reactions of Alkanolamines

(Text, Image / Professor Tsai Ming-Kang)

Lowering the atmospheric CO2 concentration is a key issue in mitigating the greenhouse effect. Alkanolamines (for example, ethanolamine, HO-CH2CH2-NH2) have shown high CO2 selectivity in past applications, but their tendency to decompose thermally has hindered the development of this liquid-phase CO2 capture technology. In this study, the excellent computing capability of the NCHC's supercomputer is used to simulate the nucleophilic reaction between CO2 and alkanolamines in the liquid phase with first-principles methods. From the perspective of industrial chemical production, the structure of the alkanolamines is then designed and modified to improve their CO2 capture ability, and molecular dynamics is used to verify the difference in CO2 capture before and after the modification. The findings provide a reliable basis for molecular design in chemical synthesis, and further quantitative analysis of the CO2 capture performance of alkanolamines in experiments is a direction worth pursuing.
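To illustrate how first-principles energies feed into such a design loop, the sketch below uses placeholder electronic energies (not the study's results) to convert reactant, transition-state, and product energies into an activation barrier and reaction energy:

```python
# Illustrative post-processing only: given first-principles energies (placeholder
# numbers, not the study's results) for reactant, transition state, and product
# of a CO2 + amine nucleophilic addition, compute the barrier and reaction energy.
HARTREE_TO_KCAL = 627.509

energies_hartree = {              # hypothetical electronic energies
    "reactants (CO2 + amine)": -358.4210,
    "transition state":        -358.3985,
    "carbamate product":       -358.4392,
}

e_r = energies_hartree["reactants (CO2 + amine)"]
e_ts = energies_hartree["transition state"]
e_p = energies_hartree["carbamate product"]

barrier = (e_ts - e_r) * HARTREE_TO_KCAL      # activation energy
delta_e = (e_p - e_r) * HARTREE_TO_KCAL       # reaction energy

print(f"activation barrier : {barrier:6.1f} kcal/mol")
print(f"reaction energy    : {delta_e:6.1f} kcal/mol")
# A modified alkanolamine would be screened by comparing these two numbers with
# those of the unmodified molecule at the same level of theory.
```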

EHTW Earth System Model

(Text, Image / Professor Chuang Ping-Chieh, Department of Environmental Engineering, National Chung Hsing University)

The EHTW (European Hamburg TaiWan) Earth System Model (ESM) is an atmosphere-ocean coupled model developed by domestic scholars that differs from conventional couplers. No international organization currently provides medium- to long-range (e.g. 45-day) atmosphere, ocean, and hydrological forecasts on a daily basis, and no research institute is known to be attempting such an experiment. Hence, establishing a medium- to long-range ocean, atmosphere, and hydrological forecasting system will be an important contribution to the international community. The project draws on Taiwan's climate and average groundwater-level observations, a water-level database, and a soil water content module; establishes a regional-scale forecasting system; and couples the WRF-Noah weather forecast system with Taiwan's locally developed surface hydrology module, thereby building the core technologies for a 45-day forecast system.
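The coupling idea can be shown schematically; the toy loop below (invented variables and feedback rules, not the EHTW, WRF-Noah, or hydrology-module code) simply illustrates how an atmosphere component and a land-hydrology component exchange precipitation and evaporation at each step of a 45-day window:

```python
# Schematic sketch of an atmosphere / surface-hydrology coupling loop.
from dataclasses import dataclass

@dataclass
class AtmosphereState:
    precipitation: float   # mm per coupling interval
    temperature: float     # degrees C

@dataclass
class HydrologyState:
    soil_moisture: float   # fraction of saturation
    runoff: float          # mm per coupling interval

def step_atmosphere(state: AtmosphereState, evaporation: float) -> AtmosphereState:
    # Toy feedback: more evaporation -> slightly more precipitation next step.
    return AtmosphereState(precipitation=0.8 * state.precipitation + 0.5 * evaporation,
                           temperature=state.temperature)

def step_hydrology(state: HydrologyState, precipitation: float):
    infiltration = min(precipitation, 1.0 - state.soil_moisture)
    soil = min(1.0, state.soil_moisture + 0.1 * infiltration)
    runoff = max(0.0, precipitation - infiltration)
    evaporation = 0.2 * soil           # toy evaporation proportional to soil moisture
    return HydrologyState(soil_moisture=soil, runoff=runoff), evaporation

atm = AtmosphereState(precipitation=5.0, temperature=25.0)
hyd = HydrologyState(soil_moisture=0.4, runoff=0.0)
evaporation = 0.0

for day in range(45):                  # the 45-day medium-range window
    atm = step_atmosphere(atm, evaporation)                     # atmosphere advances first
    hyd, evaporation = step_hydrology(hyd, atm.precipitation)   # then the land/hydrology

print(f"day 45: precip={atm.precipitation:.2f} mm, "
      f"soil moisture={hyd.soil_moisture:.2f}, runoff={hyd.runoff:.2f} mm")
```

In a real coupled system each component is a full model on its own grid, and the coupler additionally handles regridding, time interpolation, and conservation of the exchanged fluxes.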

The NCHC's platform provides an international stage for collaboration with the world's top research institutes, such as Dr. Lin Hsien-Chien at NOAA/Princeton GFDL, developer of the High Resolution Atmospheric Model (HIRAM), and Professor Tseng Yu-Heng at NCAR; with domestic research teams, such as the group of Professor Hsu Huang-Hsiung of Academia Sinica, the Central Weather Bureau, National Taiwan University, National Central University, Chinese Culture University, and National Yunlin University of Science and Technology; with departments across National Chung Hsing University, including the Department of Computer Science and Engineering, the Department of Applied Mathematics, the Department of Civil Engineering, and the Program of Landscape and Recreation; and for developing big-data products with the NCHC's environment laboratory and big image processing research team.


Antibody-Drug Conjugates

(Text, Image / Professor Wang Yen-Tseng, Kaohsiung Medical University)

As medicine advances, human beings are living longer and facing many diseases, which has drawn growing attention to drug development. Antibody-Drug Conjugates (ADCs) are targeted drugs commonly used to treat severe diseases such as cancer. ADCs currently have two serious flaws: (1) ADCs may attach to normal cells and cause them severe damage; (2) after an ADC binds to a specific cell, the entire ADC compound must be engulfed by that cell and cleaved by a specific protease to release the drug and begin killing the target cells. However, this internalization process is inefficient and often results in poor ADC performance. Many research institutes around the world are actively developing ADCs. Centrose, for example, is developing a type of ADC called Extracellular Drug Conjugates (EDCs).

EDCs function very similarly to ADCs: they are monoclonal antibodies that bind to specific cells, such as tumor cells. The greatest difference from ADCs is that EDCs are not treated as foreign objects and engulfed by the target cells. Instead, EDCs act through a protein on the cell surface, which greatly reduces the cell's drug resistance. At present, the EDCs developed by Centrose can only kill diseased cells at specific stages, but the company expects to use EDCs to develop treatments for extremely hard-to-treat diseases such as pancreatic cancer. We hope that the continued development of new cancer drugs will benefit even more patients and allow them to be properly treated.


Palladium Nanotubes Have Excellent Hydrogen Storage Ability

(Text, Image / Professor Chu Hsun-Peng, National Sun Yat-Sen University)

The rapid development of nanomaterial manufacturing technologies has overturned many long-held views of material properties. At the nanoscale, the characteristics of many materials differ from those observed in bulk. Past studies have found that some otherwise inactive materials become good catalysts at the nanoscale, and such materials have played important roles in energy technology applications. Many studies have also found that nanoparticles attached to nanoscale thin-film substrates or nanotubes have better hydrogen absorption capacity and increase the efficiency of hydrogen storage. Hence, our main research objective in recent years has been to use simulations to predict and analyze the characteristics of different nanostructures.

The Molecular Engineering Laboratory of National Sun Yat-Sen University used quantum-mechanical calculations to simulate the adsorption of hydrogen molecules on palladium nanotubes and found that they have excellent hydrogen storage capacity. The study broadens the potential applications of palladium nanotubes in the energy industry, and the result was published in the international journal Journal of Materials Chemistry. Because quantum-mechanical simulations require a massive amount of computation, they take a very long time to run. The NCHC's computing resources significantly reduced the time required for the simulations, improving research efficiency and allowing the study to be completed successfully.
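The central quantity in such calculations is the adsorption energy per hydrogen molecule; the sketch below shows the standard bookkeeping with placeholder total energies (not the published values):

```python
# Illustrative bookkeeping only (placeholder energies, not the published results):
# the hydrogen adsorption energy per H2 on a nanotube is typically evaluated as
#   E_ads = [E(tube + n H2) - E(tube) - n * E(H2)] / n
E_tube = -410.237      # eV, hypothetical total energy of the bare Pd nanotube
E_h2 = -6.762          # eV, hypothetical total energy of an isolated H2 molecule
E_tube_nh2 = -417.352  # eV, hypothetical total energy with n adsorbed H2
n = 1

e_ads = (E_tube_nh2 - E_tube - n * E_h2) / n
print(f"adsorption energy per H2: {e_ads:.3f} eV")   # negative => binding is favorable
```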


Climate Change Research in Taiwan

(Text, Image / Research Fellow Hsu Huang-Hsiung, Research Center for Environmental Changes, Academia Sinica and Dr. Shen Cheng-Yu, NCHC)

Climate change refers to the overall change of Earth's climate over a long period of time, and it concerns the sustainable development of every country as well as the survival of the human race. Climate change has a tremendous impact on Taiwan, which has an island-type climate, causing frequent short-duration, heavy rainfall events (i.e., hourly rainfall exceeding 100 mm) that lead to flooding and debris flows. Hence, using computer simulations to understand the effect of global climate change on Taiwan, and thereby reduce the related disasters, is a task of top priority.

Starting in mid-2011, the Research Center for Environmental Changes of Academia Sinica collaborated with researchers from National Taiwan University, National Central University, and National Taiwan Normal University to form the Consortium for Climate Change Study. This five-year project establishes key capabilities for simulating and interpreting Taiwan's climate change. The objectives of the Consortium are as follows: (1) to develop Taiwan's capability in model development, build a climate system model that can be further improved locally, and provide it to the research community for studies on climate variability and change; (2) to use the models to evaluate and project the potential impact of climate change on the East Asian climate and monsoon, and on extreme weather (e.g. typhoons, torrential rain, and drought) in Taiwan.

Climate change projections involve the analysis of simulation results for multiple scenarios. Because of Taiwan's high terrain and complex hydrology, the models require high temporal and spatial resolution, so climate change research demands a massive amount of computing resources and storage space. The NCHC's computing resources and technical consultation not only increased the spatial resolution of the simulations from 100 km x 100 km to 23 km x 23 km, but also cut the required computing time in half, which greatly benefits numerical simulations of climate change.
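A rough scaling argument (not a benchmark of the actual model) shows why the finer grid is so demanding:

```python
# Simple scaling illustration: refining the horizontal grid from 100 km to 23 km
# multiplies the number of grid columns over the same domain, and the time step
# usually shrinks in proportion to the grid spacing, so the cost grows even faster.
coarse, fine = 100.0, 23.0                 # grid spacings in km
column_factor = (coarse / fine) ** 2       # ~19x more columns
cost_factor = (coarse / fine) ** 3         # ~82x if the time step also scales with dx
print(f"grid columns: ~{column_factor:.0f}x, naive cost: ~{cost_factor:.0f}x")
```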


Quantum Fluctuations in the QCD Vacuum

(Text, Image / Professor Chao Ting-Wai, Department of Physics, National Taiwan University)

Understanding the vacuum of quantum chromodynamics (QCD) has a tremendous impact on today's most challenging questions in physics and astronomy. QCD is the fundamental theory of the interaction between quarks and gluons. It provides the theoretical framework for understanding the nuclear force and nuclear energy from first principles, and it also plays an important role in the evolution of the early universe. Using NCHC supercomputers, researchers have performed the most realistic simulations to date of the quantum fluctuations of the QCD vacuum. By comparing these simulations with experimental data from major laboratories, physicists can gain new insights into key features of the strong interaction at the subatomic scale, and guide their search for new insights into how the early universe evolved from the quark-gluon plasma phase to the hadron phase.

To simulate the quantum fluctuations of the QCD vacuum, researchers have used the DWFQCD code developed by Professor Chao Ting-Wai and his research team at National Taiwan University. The code simulates the quantum fluctuations of quarks and gluons from the first principles of QCD, using a lattice fermion formulation with optimal chiral symmetry. DWFQCD is very computationally intensive, pushing the limits of computational methods and requiring the fastest supercomputers. Since 2009, Professor Chao's team has taken advantage of the enormous floating-point computing power delivered by Nvidia GPUs, using GPU clusters at the Department of Physics of NTU and at the NCHC to simulate QCD with dynamical quarks. The researchers ran the most advanced lattice QCD simulation to date on the NCHC's Formosa-5 GPU cluster, incorporating the quantum fluctuations of the (u, d, s, c) quarks and gluons and providing scientists with dynamical gauge configurations for understanding hadron physics as well as the basic science of the early universe. These successes are largely due to DWFQCD's world-leading, high-precision chiral symmetry, which captures the most essential feature of the quarks.

With these dynamical gauge configurations, researchers can probe the physics of the QCD vacuum using the low-lying eigenmodes of the quark matrix, since they encapsulate the information needed to answer the crucial questions: how the proton gains its mass through the spontaneous chiral symmetry breaking of the QCD vacuum, and at what temperature in the early universe chiral symmetry is restored. The NCHC's ALPS supercomputer plays an important role in making this research possible, providing a platform for computing the low-lying eigenmodes of the quark matrix, which requires very large-scale computations.
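The numerical task can be illustrated with a stand-in problem (a random sparse symmetric matrix, not the actual lattice Dirac operator produced by DWFQCD): a Lanczos-type sparse eigensolver extracts the eigenvalues closest to zero, the analogue of the low-lying modes discussed above:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustration of the numerical task only: extract the low-lying eigenmodes of a
# large sparse Hermitian operator with a Lanczos-type solver.  The random matrix
# below is a stand-in; the production operator is the lattice quark matrix, and
# the real problem sizes are vastly larger.
rng = np.random.default_rng(1)
n = 4000
A = sp.random(n, n, density=1e-3, random_state=rng, format="csr")
H = (A + A.T) * 0.5 + sp.diags(rng.normal(size=n))   # symmetric sparse stand-in

# Shift-invert Lanczos: eigenvalues nearest zero are the "low-lying" modes.
vals, vecs = spla.eigsh(H, k=20, sigma=0.0, which="LM")

print("20 eigenvalues closest to zero:")
print(np.sort(np.abs(vals)))
# In lattice QCD, the density of such near-zero modes is related, via the
# Banks-Casher relation, to the chiral condensate and hence to spontaneous
# chiral symmetry breaking.
```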


1/12 Reactor Core Computation

(Text, Image / Professor Chien Ching-Chang, National Tsing Hua University)

The development of a gas-cooled very high temperature reactor (VHTR) has always been an important option for next-generation nuclear power plants, and there continue to be many studies of the flow and temperature fields of such reactors. Given the high demands of nuclear safety, the analysis of thermal-hydraulic phenomena during accident scenarios has become more important than ever. The analysis of any core-related accident must be based on the normal conditions before the accident, and the accuracy of simulations of normal core conditions depends on many key design parameters of the reactor. This study uses bypass flow as the basis for examining normal core conditions and verifying that the calculations are reasonable, and then analyzes loss-of-flow accidents (LOFA).

The research team began computations with a single channel and gradually expanded to the 1/12 core section, ensuring that the 1/12 core section was validated in order to obtain more reliable results. The team found that the analysis of natural convection in the core was clearly affected by the geometric shape and power in the model, and required analysis at a larger and more comprehensive scale. The team assumed both global and local symmetries to obtain a detailed estimate of the strength of the resulting natural circulation, as well as the level of heat transfer, through a relatively large simulation using limited computing resources. Finally, the team successfully computed a more accurate natural convection flow using the validated 1/12 core model. Large-scale 1/12 core section computation becomes even more important for LOFA analysis. The computation was repeatedly validated on the NCHC's supercomputer, which shortened the time required for the complex computations of each case from 1-2 years to only about 2 months. Even so, the entire study took over a year to complete. In other words, without the NCHC's supercomputer, such a large-scale CFD simulation would not have been possible!
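For a rough sense of the natural-convection regime in a heated coolant channel (a textbook estimate with placeholder property values, not part of the CFD study itself), one can evaluate a channel Rayleigh number:

```python
# Back-of-the-envelope check only (placeholder property values):
# the Rayleigh number indicates how vigorous buoyancy-driven natural convection
# in a heated coolant channel is expected to be.
g = 9.81            # m/s^2
T_mean = 1100.0     # K, hypothetical mean coolant temperature
beta = 1.0 / T_mean # 1/K, thermal expansion coefficient of an ideal gas
dT = 200.0          # K, hypothetical wall-to-bulk temperature difference
L = 0.016           # m, hypothetical channel hydraulic diameter
nu = 1.1e-5         # m^2/s, hypothetical kinematic viscosity
alpha = 1.6e-5      # m^2/s, hypothetical thermal diffusivity

Ra = g * beta * dT * L**3 / (nu * alpha)
print(f"Rayleigh number ~ {Ra:.2e}")
# A modest Rayleigh number suggests laminar natural convection in the channel;
# the full 1/12-core CFD model resolves the actual flow rather than relying on
# such correlations.
```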


Application of Molecular Simulation Technology in Organic Photovoltaic Cells

(Text, Image/Research Fellow Pao Chun-Wei, Research Center for Applied Sciences, Academia Sinica)

Photovoltaic (PV) cells are devices that convert solar energy into direct current through the photoelectric effect, and their core materials are generally semiconductors. The development of organic photovoltaic (OPV) cells was driven by the demand for lighter and thinner electronic devices, as well as by environmental concerns. OPV cells are cheaper, portable, and flexible, and can therefore be mounted on many different surfaces to provide power. OPV cells are PV cells made entirely or partially from organic materials, using conducting polymers or small molecules to absorb light and transfer charge. The energy gap of the organic molecules can be tuned by changing the length and functional groups of the polymers. Organic compounds have very high molar extinction coefficients, so even a small amount of material can absorb a large amount of light. There are now organic polymers with power conversion efficiencies above 10%, but it is hard to control the molecular weight of polymer electron-donor materials, so cell efficiency has been inconsistent, which is extremely detrimental to commercialization. This study replaces the polymers with small molecules as the electron-donor material, while the acceptor material remains C60 or C70 derivatives. In small-molecule synthesis, the intermediates can be controlled far more precisely than polymers, so the quality of the PV cells is more uniform and commercialization faces relatively few issues.

The Research Center for Applied Sciences, Academia Sinica, and a team from the Department of Materials Science and Engineering, National Tsing Hua University, collaborated on simulating the solvent annealing and vacuum evaporation processes of OPV cells. The work uses molecular simulation technology to directly simulate the active-layer structure of small-molecule OPV cells during the vacuum evaporation and solvent annealing processes, helping the experimental team optimize the conditions for producing more efficient OPV cells. Parameters that directly improve the power conversion efficiency of OPV materials have been successfully identified, which can directly benefit the booming domestic solar power industry.
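A heavily simplified sketch of the idea behind such deposition simulations is given below, using generic Lennard-Jones particles in the ASE library rather than the actual coarse-grained donor/acceptor model: particles are dropped onto a substrate one at a time and relaxed with Langevin dynamics, mimicking film growth during vacuum evaporation.

```python
import numpy as np
from ase import Atoms, units
from ase.build import fcc111
from ase.calculators.lj import LennardJones
from ase.constraints import FixAtoms
from ase.md.langevin import Langevin

# Toy vacuum-deposition sketch (generic Lennard-Jones argon, not the real
# donor/acceptor model or force field used in the study).
slab = fcc111("Ar", size=(4, 4, 3), a=5.26, vacuum=30.0)
slab.calc = LennardJones(sigma=3.40, epsilon=0.0104, rc=10.0)   # Ar-like parameters
slab.set_constraint(FixAtoms(mask=slab.positions[:, 2] < slab.positions[:, 2].min() + 1.0))

rng = np.random.default_rng(0)
for k in range(10):                              # deposit ten "molecules"
    x, y = rng.uniform(0.0, 8.0, size=2)         # random lateral landing spot (Angstrom)
    z = slab.positions[:, 2].max() + 8.0         # start above the growing film
    incoming = Atoms("Ar", positions=[[x, y, z]])
    incoming.set_velocities([[0.0, 0.0, -0.05]]) # gentle downward velocity (ASE units)
    slab += incoming
    dyn = Langevin(slab, timestep=5 * units.fs, temperature_K=100, friction=0.02)
    dyn.run(600)                                 # let the newcomer land and thermalize

print("film thickness (Angstrom):",
      slab.positions[:, 2].max() - slab.positions[:, 2].min())
```

The production simulations replace the single Lennard-Jones atoms with molecular models of the donor and acceptor and run at much larger scales, which is where the NCHC cluster becomes essential.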

Unlike the computer clusters set up in individual laboratories, the NCHC's high-performance computing cluster allowed this study to break through the system-size constraints of traditional atomistic molecular simulation, enabling more direct comparison with experimental results. It provides a full view that is not available from experiments and increased the quality and quantity of the resulting research papers. The results have been published in prestigious journals such as Energy & Environmental Science (I.F. = 9.61) and JPCC (I.F. = 4.805).


Investigating Suitable Materials for Photocatalytic Water Splitting

(Text, Image/Research Fellow Kuo Che-Lai, Institute of Atomic and Molecular Sciences, Academia Sinica)

The crisis of an imminent global petroleum shortage has driven investigations into renewable energy resources that may serve as a replacement. Harvesting solar energy through photocatalytic water splitting is one of the methods believed to be feasible for the mass production of hydrogen gas.

Scientists believe that understanding the reaction mechanisms and principles is extremely important for designing a stable and highly efficient solar energy conversion system. In practice, many experiments cannot effectively observe the detailed mechanisms on a molecular scale, but theoretical simulations can help us understand the reactions and changes throughout the entire process, including charge transfer, reaction energy barriers, and the density of states. The increase in parallel computing performance is making complex theoretical models that approach experimental conditions more and more feasible. The team of Dr. Kuo Che-Lai is interested in the mechanisms of water splitting and hydrogen generation, and is also investigating suitable materials for photocatalytic water splitting. The team has already studied water splitting on polar and non-polar GaN surfaces, as well as the effect of mixing ZnO with GaN on water splitting.
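A typical post-processing step in such studies is building the density of states; the sketch below uses random placeholder eigenvalues (not results for GaN or ZnO/GaN) to show the standard Gaussian-smearing procedure:

```python
import numpy as np

# Illustrative post-processing only: build a density of states (DOS) by Gaussian
# smearing of Kohn-Sham eigenvalues.  The eigenvalues here are random placeholders;
# in practice they come from the DFT calculation of the surface slab.
rng = np.random.default_rng(0)
valence = rng.normal(-3.0, 0.8, 200)       # placeholder occupied states (eV)
conduction = rng.normal(3.5, 0.8, 150)     # placeholder empty states (eV)
eigenvalues = np.concatenate([valence, conduction])
sigma = 0.1                                # Gaussian smearing width (eV)

energies = np.linspace(-8.0, 8.0, 801)
dos = np.exp(-((energies[:, None] - eigenvalues[None, :]) ** 2) / (2 * sigma**2)).sum(axis=1)
dos /= sigma * np.sqrt(2 * np.pi)

gap = conduction.min() - valence.max()
print(f"DOS evaluated on {energies.size} grid points; placeholder band gap ~ {gap:.2f} eV")
# For photocatalytic water splitting, the band edges must straddle the H2/H2O and
# O2/H2O redox potentials, one of the criteria used to screen candidate materials.
```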

With the NCHC's support, Dr. Kuo Che-Lai's research team completed many tasks using over 20 million computing hours between 2012 and 2013. Some of the complex parallel computing models used up to 512 processors, and the team will continue to use the NCHC's excellent facilities for research.


Core-Collapse Supernova Simulation

(Text, Image/Kuo-Chuan Pan, Department of Physics and Institute of Astronomy, National Tsing Hua University)

A core-collapse supernova is the catastrophic explosion that occurs at the end of the evolution of a massive star. It generates powerful electromagnetic, neutrino, and gravitational-wave emission, and it is also the origin of neutron stars and stellar-mass black holes. While the explosion mechanism is not yet fully understood theoretically, the mainstream theory holds that the interaction between neutrinos and baryons at extremely high density and pressure drives the explosion. In the late stages of stellar evolution, when the outward pressure from electron degeneracy and nuclear reactions in the core can no longer withstand gravity, the star collapses violently, compressing the core to nuclear density; the core then bounces back, producing a shock wave. Multi-dimensional magnetohydrodynamic (MHD) simulations with neutrino transport have been used to study this explosion mechanism, and they can also predict the associated electromagnetic, neutrino, and gravitational radiation.
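The violence of the collapse can be conveyed with a textbook order-of-magnitude estimate (separate from the MHD simulations themselves): the free-fall timescale at a representative pre-collapse central density.

```python
import numpy as np

# Order-of-magnitude illustration using the standard free-fall formula
# t_ff = sqrt(3*pi / (32*G*rho)); the densities are representative values,
# not outputs of the simulation.
G = 6.674e-8          # gravitational constant, cgs units
rho_core = 1.0e10     # g/cm^3, representative pre-collapse central density
rho_nuclear = 2.7e14  # g/cm^3, nuclear saturation density, where the bounce occurs

t_ff = np.sqrt(3.0 * np.pi / (32.0 * G * rho_core))
print(f"free-fall time at rho = {rho_core:.0e} g/cm^3: {t_ff * 1e3:.1f} ms")
print(f"density must rise by a factor of ~{rho_nuclear / rho_core:.0e} before core bounce")
```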