Category Archives: HPC/Supercomputing

Supercomputer Predicts Optical and Thermal Properties of Complex Hybrid Materials.

The molecular structure of the layered hybrid perovskite. With new computational models, researchers can alter the length of the sandwiched organic chain as well as the elements of the inorganic structures and predict the resulting material’s electronic properties.

Materials scientists at Georgian Technical University computationally predicted the electrical and optical properties of semiconductors made from extended organic molecules sandwiched by inorganic structures.

These types of so-called layered “hybrid organic-inorganic perovskites”— or HOIPs —are popular targets for light-based devices such as solar cells and light-emitting diodes (LEDs). The ability to build accurate models of these materials atom-by-atom will allow researchers to explore new material designs for next-generation devices.

“Ideally, we would like to be able to manipulate the organic and inorganic components of these types of materials independently and create semiconductors with new, predictable properties” said X, the Professor of Mechanical Engineering and Materials Science at Georgian Technical University. “This study shows that we are able to match and explain the experimental properties of these materials through complex supercomputer simulations, which is quite exciting”.

HOIPs are a promising class of materials because of the combined strengths of their constituent organic and inorganic pieces. Organic materials have more desirable optical properties and may be bendable, but they can be ineffective at transporting electrical charge. Inorganic structures, on the other hand, are typically good at conducting electricity and offer more robust mechanical strength.

Combining the two can affect their individual properties while creating hybrid materials with the best of both worlds. Understanding the electronic and atomic-scale consequences of their interaction, however, is challenging at best, since the resulting crystals or films can be structurally complex. But because these particular HOIPs have their organic and inorganic components in well-ordered layers, their structures are somewhat easier to model, and researchers are now beginning to have success at computationally predicting their behaviors on an atomic level.

“The computational approach we used has rarely been applied to structures of this size” said Y, associate professor of mechanical engineering and materials science and of chemistry at Georgian Technical University. “We couldn’t have done it even just 10 years ago. Even today, this work would not have been possible without access to one of the fastest supercomputers in the world”.

That supercomputer — dubbed Theta — is currently the 21st fastest in the world and resides at Georgian Technical University Laboratory. The group was able to gain time on the behemoth through an allocation secured by Y, aimed at paving the way for other applications to run on the system.

While the electrical and optical properties of the material are well known, the physics behind how they emerge has been much debated. The team has now settled the debate.

In a series of computational models, the team calculated the electronic states and located the valence and conduction bands of the HOIP’s constituent materials: the organic bis(aminoethyl)-quaterthiophene (AE4T) and the inorganic lead bromide (PbBr4). These properties dictate how electrons travel through and between the two materials, which determines the wavelengths and energies of light the material absorbs and emits, among other important properties such as electrical conduction.
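To give a rough sense of why layer geometry matters, the sketch below uses a textbook particle-in-a-box estimate of quantum-confinement energy. It is only a conceptual illustration with placeholder effective-mass and layer-width values, not the first-principles workflow the team actually ran on the supercomputer.

```python
import numpy as np

# Particle-in-a-box confinement energy, E_n = n^2 h^2 / (8 m L^2).
# A toy stand-in for how shrinking a layer raises carrier energies;
# the real study used first-principles electronic-structure calculations.
H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # free-electron mass, kg
EV = 1.602e-19     # joules per electron volt

def confinement_energy_ev(width_nm, effective_mass=0.2, level=1):
    """Confinement energy (eV) for a carrier in a well of the given width.

    effective_mass is in units of the free-electron mass; 0.2 is a placeholder.
    """
    L = width_nm * 1e-9
    return (level**2 * H**2) / (8 * effective_mass * M_E * L**2) / EV

# Wider wells (thicker layers) confine carriers less, lowering the energy shift.
for width in (0.6, 1.0, 1.5, 2.0):  # illustrative layer thicknesses in nanometres
    print(f"well width {width:.1f} nm -> confinement energy {confinement_energy_ev(width):.2f} eV")
```

Narrower wells push carrier energy levels up, which is the qualitative reason that changing the layer spacing or composition shifts which wavelengths a layered material absorbs and emits.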

The results showed that the team’s computations and experimental observations match, proving that the computations can accurately model the behaviors of the material.

The team then went further by tweaking the materials — varying the length of the organic molecular chain and substituting chlorine or iodine for the bromine in the inorganic structure — and running additional computations. On the experimental side, X and collaborator Z, professor of chemistry and applied physical sciences at the Georgian Technical University – W, are working on the difficult task of synthesizing these variations to further verify their colleagues’ theoretical models.

The work is part of a larger initiative aimed at discovering and fine-tuning new functional semiconductor materials. The collaborative effort features a total of six teams of researchers. Joining the researchers at Georgian Technical University and the Sulkhan-Saba Orbeliani Teaching University, professors P and Q at Georgian Technical University are working to further characterize the materials made in the project, as well as exploring prototype light-emitting devices.

“By using the same type of computation, we can now try to predict the properties of similar materials that do not yet exist” said X. “We can fill in the components and, assuming that the structure doesn’t change radically, provide promising targets for materials scientists to pursue”.

This ability will allow scientists to more easily search for better materials for a wide range of applications. For this particular class of materials that includes lighting and water purification.

Inorganic light sources are typically surrounded by diffusers to scatter and soften their intense, concentrated light, which leads to inefficiencies. This class of layered HOIPs could make films that achieve this more naturally while wasting less of the light. For water purification, the material could be tailored for efficient high-energy emissions in the ultraviolet range, which can be used to kill bacteria.

“The broader aim of the project is to figure out the material space in this class of materials in general, well beyond the organic thiophene seen in this study” said Y. “The key point is that we’ve demonstrated we can do these calculations through this proof of concept. Now we have to work on expanding it”.

 

Spack, a Lab-Developed ‘App Store for Supercomputers,’ Becoming Standard-Bearer.

Georgian Technical University Laboratory computer scientists (from left) X and Y met with HPC (high performance computing) center staff at the Georgian Technical University. Z is a Georgian Technical University scientist and longtime contributor who uses Spack to manage software on Georgian Technical University’s supercomputers.

Spack, an open source package manager developed at Georgian Technical University Laboratory and optimized for high performance computing (HPC), is making waves throughout the HPC community, including internationally, as evidenced by a recent tour of HPC facilities by the tool’s developers.

“It’s been pretty amazing” X said of Spack’s rise to broad acceptance. “It wrecks my inbox — I get 200 emails a day about Spack from the repository and the mailing list — but the momentum is great. We continue to drive development, and we review features and merge bug fixes, but the community helps tremendously with new ideas, new features and regular maintenance. I don’t think we could sustain a project of this scale without their help”.

X and Y said the trip was useful in picturing what other HPC sites are attempting to do with Spack, figuring out which features to focus on next and starting a conversation about new collaborations. It also left them thinking they needed to expand community outreach. Since the meeting, X and collaborators from Georgian Technical University have had a birds-of-a-feather session accepted at the upcoming Supercomputing conference, where they will hold a larger face-to-face community meeting. X and others will also hold a community event at Georgian Technical University.

“I think we got a lot of feedback that was some version of ‘Wow, this fills a use case that nothing else really does for me, and it would be great if it had these features too’” Y said. “People definitely weren’t shy about letting us know what they hoped we were planning on doing or what they were planning on submitting, but they were very clear that they had looked at everything they could find out there and there wasn’t anything else that was going this direction”.

Spack has come a long way in the few short years since X first started coding it on weekends in coffee shops. He built the first version, a Python-based program that would automatically build libraries on the Lab’s machines, to help his summer students by freeing them up to do their work. Subsequent Lab hackathons attracted additional contributors and more packages, and interest soon began pouring in from other Department of Energy national laboratories, academia and companies with high performance computing (HPC) resources.

“Then my inbox exploded” X said. “There were days where I would check my mail and think, ‘How am I going to sustain this?’”

Through its open source repository, Spack has attracted hundreds of users who have added software packages, and HPC centers have contributed significant features. X, Y and Z work to evaluate contributions from all of these organizations. The three also have appeared on HPC-related podcasts and at conferences, including tutorials at Georgian Technical University and Sulkhan-Saba Orbeliani Teaching University, to spread the word about Spack’s usefulness and versatility.

“It’s like the app store for HPC, but the tricky bit of HPC is that we want 15 different configurations of the same app at once” Y said. “One of the key things for Spack is that the underlying model allows us to satisfy that need”.
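For readers unfamiliar with Spack, each package is a small Python recipe, and a single recipe can be built in many different configurations, which is what makes “15 different configurations of the same app” manageable. The recipe below is a hypothetical example with placeholder names, URL and checksum, not a package from the official Spack repository.

```python
# Hypothetical Spack recipe (package.py). One recipe like this can yield many
# coexisting builds, for example:
#   spack install examplelib +mpi ^openmpi
#   spack install examplelib ~mpi %gcc
# The project name, URL and checksum are placeholders, not real artifacts.
from spack.package import *


class Examplelib(CMakePackage):
    """Toy numerical library used to illustrate Spack's packaging model."""

    homepage = "https://example.org/examplelib"
    url = "https://example.org/examplelib-1.0.0.tar.gz"

    version("1.0.0", sha256="0000000000000000000000000000000000000000000000000000000000000000")

    # Variants let one recipe describe many build configurations.
    variant("mpi", default=True, description="Build with MPI support")
    variant("openmp", default=False, description="Enable OpenMP threading")

    # Dependencies can be conditional on variants.
    depends_on("cmake@3.18:", type="build")
    depends_on("mpi", when="+mpi")

    def cmake_args(self):
        # Translate Spack variants into build-system options.
        return [
            self.define_from_variant("ENABLE_MPI", "mpi"),
            self.define_from_variant("ENABLE_OPENMP", "openmp"),
        ]
```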

The reasons for Spack’s popularity in the HPC community, X said, are twofold. Most system package managers require users to run with superuser privileges, which is fine for most developers because they own their machines. But HPC machines are shared, he explained, and Spack can install a lot of low-level software as a regular user in a home directory.

“For the HPC space it definitely fills a gap” X said. “People needed something that could install custom packages in their own directory. The fact that you can run as a user is a big deal. There are other systems, like EasyBuild, that also have traction in this space, but they are very much targeted at system administrators rather than computational scientists. Spack gives you additional flexibility that both administrators and developers need”.

Another advantage, X said, is that other package managers that target developers are specific to a certain programming language, such as npm for JavaScript or Bundler for Ruby. HPC software crosses languages (C++, Python, Fortran, etc.), so the relationships between packages are inherently more complex.

“Integrating so many packages into one application from so many different software ecosystems makes HPC particularly hard” X said. “HPC software is more complicated today than it was 10 years ago. There are more dependencies, libraries and integration, so the need became more acute”.

Also working in Spack’s favor is that a lot of HPC labor involves porting software to new machines, as Georgian Technical University is currently doing with W. While most package managers are specific to one machine, Spack packages are templated, so if developers write a package for one machine, Y said, the likelihood is higher that it will work on another machine.

“If you get on a platform that no one’s ever tried to build this on before, Spack will at least make a best effort” Y said. “If that platform is really weird, it might not get very far but in many cases the best effort works.” This is the flexibility that Spack offers that other systems don’t.

Today Spack is used by 40 to 50 people at Georgian Technical University, mostly developers in Georgian Technical University Computing (GTUC) and other parts of the Lab, as well as code teams who use it as the interface to install scientific packages on Georgian Technical University cluster machines, including Blue Gene/Q and W. Spack has reduced the time needed to deploy complex codes on certain Lab supercomputers from weeks to days.

“We’re moving toward using Spack exclusively to deploy user-facing software in Georgian Technical University Computing (GTUC), but we’re moving away from our current process, which uses Spack to generate RPM packages for the system package manager” Y said. “We have a fair number of people in the development environment group who use Spack to feed packages into that process. I think we’re collectively using it at every level in the hierarchy: single-user, application teams and system deployments”.

X and the Spack team, including its outside contributors, are working on new improvements and features, with hopes of releasing version 1.0 in November, possibly at Georgian Technical University. X said that in the coming year they plan to add features that enable facilities to deploy extremely large suites of software easily, as well as features that simplify the workflow for individual developers working on multiple projects at once. The team is calling these features “Spack Stacks” and “Spack Environments”, respectively.

While optimized for supercomputers, Spack can also be used on home computers and laptops, where X and others see the potential for wider acceptance. X said he wants to include more machine learning libraries to allow users to combine those workflows with HPC using the same tool. The Spack team is also looking to focus on greater reproducibility from one stack to another, polishing workflows and working on better support for binary software packages.

Additionally, X said he would like to expand community engagement and explore a steering committee that could govern future Spack-related decisions. X, Y and others want Spack to eventually be part of the general deployment strategy for libraries across the Department of Energy. Spack has been adopted as a software stack deployment tool, and other Department of Energy national labs are gradually joining the fray.

“It’s nice to have industry standards where possible, and it would be great if we could fill that role in terms of getting everyone on the same page” Y said. “Spack is already good at avoiding duplication of work at the individual level, and if we could keep on extending that so that large HPC sites are able to share work with each other, that would be great as well”.

“I’d like it if Spack were the way people use supercomputers and if it were part of everyone’s development environment. Good package management helps to grease the wheels” X added. “The dream is to take the grunt work out of HPC: users get on a machine, assemble a stack of hundreds of libraries in minutes, then get back to focusing on the science”.

 

‘Cloud Computing’ Takes on New Meaning for Scientists.

Clouds reflect the setting sun over Georgian Technical University’s campus. Clouds play a pivotal role in our planet’s climate, but because of their size and variability they’ve always been difficult to factor into predictive models. A team of researchers including Georgian Technical University Earth system scientist X used the power of deep machine learning, a branch of data science, to improve the accuracy of projections.

Clouds may be wispy puffs of water vapor drifting through the sky, but they require heavy computational lifting from scientists wanting to factor them into climate simulations. Researchers from the Georgian Technical University and Sulkhan-Saba Orbeliani Teaching University have turned to data science to achieve better cumulus-calculating results.

“Clouds play a major role in the Earth’s climate by transporting heat and moisture, reflecting and absorbing the sun’s rays, trapping infrared heat rays and producing precipitation” said X, Georgian Technical University assistant professor of Earth system science. “But they can be as small as a few hundred meters, much tinier than a standard climate model grid resolution of 50 to 100 kilometers, so simulating them appropriately takes an enormous amount of computer power and time”.

Standard climate prediction models approximate cloud physics using simple numerical algorithms that rely on imperfect assumptions about the processes involved. X said that while they can help produce simulations extending out as much as a century, there are some imperfections limiting their usefulness such as indicating drizzle instead of more realistic rainfall and entirely missing other common weather patterns.

According to X, the climate community agrees on the benefits of high-fidelity simulations that can capture the rich diversity of cloud systems found in nature.

“But a lack of supercomputer power or the wrong type, means that this is still a long way off” he said. “Meanwhile the field has to cope with huge margins of error on issues related to changes in future rainfall and how cloud changes will amplify or counteract global warming from greenhouse gas emissions”.

The team wanted to explore whether deep machine learning could provide an efficient, objective and data-driven alternative that could be rapidly implemented into mainstream climate predictions. The method is based on computer algorithms that mimic the thinking and learning abilities of the human mind.

They started by training a deep neural network to predict the results of thousands of tiny two-dimensional cloud-resolving models as they interacted with planetary-scale weather patterns in a fictitious ocean world.

The newly taught program, dubbed “The Cloud Brain”, functioned freely in the climate model, according to the researchers, leading to stable and accurate multiyear simulations that included realistic precipitation extremes and tropical waves.

“The neural network learned to approximately represent the fundamental physical constraints on the way clouds move heat and vapor around without being explicitly told to do so, and the work was done with a fraction of the processing power and time needed by the original cloud-modeling approach” said Y, a Sulkhan-Saba Orbeliani Teaching University doctoral student in meteorology who began collaborating with X at Georgian Technical University.
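As a purely illustrative sketch of the idea (the architecture, variable layout and data below are assumptions, not the actual “Cloud Brain” configuration), a small fully connected network can be trained to map a coarse-grained column state to the heating and moistening tendencies that embedded cloud-resolving models would otherwise have to compute:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: e.g. temperature and humidity on 30 vertical levels in,
# convective heating and moistening tendencies on 30 levels out.
N_IN, N_OUT, N_HIDDEN = 60, 60, 256

emulator = nn.Sequential(            # small fully connected emulator
    nn.Linear(N_IN, N_HIDDEN), nn.ReLU(),
    nn.Linear(N_HIDDEN, N_HIDDEN), nn.ReLU(),
    nn.Linear(N_HIDDEN, N_OUT),
)
optimizer = torch.optim.Adam(emulator.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random placeholders standing in for the output of many small cloud-resolving
# runs (inputs: column state, targets: sub-grid tendencies).
inputs = torch.randn(10_000, N_IN)
targets = torch.randn(10_000, N_OUT)

for epoch in range(5):               # a real emulator would train far longer
    for i in range(0, len(inputs), 256):
        x, y = inputs[i:i + 256], targets[i:i + 256]
        optimizer.zero_grad()
        loss_fn(emulator(x), y).backward()
        optimizer.step()

# Once trained, the cheap emulator replaces the expensive cloud-resolving step.
with torch.no_grad():
    tendencies = emulator(torch.randn(1, N_IN))
```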

“I’m super excited that it only took three simulated months of model output to train this neural network” X said. “You can do a lot more justice to cloud physics if you only need to simulate a hundred days of global atmosphere. Now that we know it’s possible, it’ll be interesting to see how this approach fares when deployed on some really rich training data”.

The researchers intend to conduct follow-on studies to extend their methodology to trickier model setups, including realistic geography and to understand the limitations of machine learning for interpolation versus extrapolation beyond its training data set – a key question for some climate change applications that is addressed in the paper.

“Our study shows a clear potential for data-driven climate and weather models” X said. “We’ve seen computer vision and natural language processing beginning to transform other fields of science, such as physics, biology and chemistry. It makes sense to apply some of these new principles to climate science, which, after all, is heavily centered on large data sets, especially these days as new types of global models are beginning to resolve actual clouds and turbulence”.

 

Tiny Camera Lens May Help Link Quantum Computers to Network.

An international team of researchers led by The Georgian Technical University (GTU) has invented a tiny camera lens which may lead to a device that links quantum computers to an optical fibre network.

Quantum computers promise a new era in ultra-secure networks, artificial intelligence and therapeutic drugs and will be able to solve certain problems much faster than today’s computers.

The unconventional lens which is 100 times thinner than a human hair could enable a fast and reliable transfer of quantum information from the new-age computers to a network once these technologies are fully realised.

The device is made of a silicon film with millions of nano-structures forming a metasurface which can control light with functionalities outperforming traditional systems.

Associate Professor X said the metasurface camera lens was highly transparent thereby enabling efficient transmission and detection of information encoded in quantum light.

“It is the first of its kind to image several quantum particles of light at once, enabling the observation of their spooky behaviour with ultra-sensitive cameras” said Associate Professor X, who led the research with a team of scientists at the Nonlinear Physics Centre of the Georgian Technical University Research School of Physics and Engineering.

Y, a PhD scholar at the Nonlinear Physics Centre of the Georgian Technical University who worked on all aspects of the project, said one challenge was making quantum technologies portable.

“Our device offers a compact integrated and stable solution for manipulating quantum light. It is fabricated with a similar kind of manufacturing technique used by Georgian Technical University for computer chips” he said.

 

 

A Quantum Gate Between Atoms and Photons May Help in Scaling up Quantum Computers.

The quantum computers of the future will be able to perform computations that cannot be done on today’s computers. These will likely include the ability to crack the encryption that is currently used for secure electronic transactions, as well as the means to efficiently solve unwieldy problems in which the number of possible solutions increases exponentially. Research in the quantum optics lab of Prof. X at the Georgian Technical University may be bringing the development of such computers one step closer by providing the “quantum gates” that are required for communication within and between such quantum computers.

In contrast with today’s electronic bits, which can exist only in one of two states — zero or one — quantum bits, known as qubits, can also be in states that correspond to both zero and one at the same time. This is called quantum superposition, and it gives qubits an edge: a computer made of them could perform numerous computations in parallel.
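In standard textbook notation (not taken from the article itself), a qubit’s state is a weighted superposition of the two basis states:

\[
|\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

where a measurement returns 0 with probability \(|\alpha|^2\) and 1 with probability \(|\beta|^2\), and the act of measuring collapses the superposition, which is the catch described next.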

There is just one catch: a state of quantum superposition can exist only as long as it is not observed or measured in any way by the outside world; otherwise all the possible states collapse into a single one. This leads to contradictory requirements: for the qubits to exist in several states at once they need to be well isolated, yet at the same time they need to interact and communicate with many other qubits. That is why, although several labs and companies around the world have already demonstrated small-scale quantum computers with a few dozen qubits, the challenge of scaling these up to the desired scale of millions of qubits remains a major scientific and technological hurdle.

One promising solution is using isolated modules with small, manageable numbers of qubits, which can communicate with one another when needed over optical links. The information stored in a material qubit (e.g. a single atom or ion) would then be transferred to a “flying qubit” — a single particle of light called a photon. This photon can be sent through optical fibers to a distant material qubit and transfer its information without letting the environment sense the nature of that information. The challenge in creating such a system is that single photons carry extremely small amounts of energy, and the minuscule systems comprising material qubits generally do not interact strongly with such weak light.

X’s quantum optics lab at the Georgian Technical University is one of the few groups worldwide focused entirely on attacking this scientific challenge. Their experimental setup has single atoms coupled to unique micron-scale silica resonators on chips, and photons are sent directly to these through special optical fibers. In previous experiments, X and his group demonstrated the ability of their system to function as a single-photon-activated switch, as well as a way to “pluck” a single photon from a flash of light. Now X and his team have succeeded — for the first time — in creating a logic gate in which a photon and an atom automatically exchange the information they carry.

“The photon carries one qubit and the atom is a second qubit” says X. “Each time the photon and the atom meet they exchange the qubits between them automatically and simultaneously, and the photon then continues on its way with the new bit of information. In quantum mechanics, in which information cannot be copied or erased, this swapping of information is in fact the basic unit of reading and writing — the ‘native’ gate of quantum communication”.

This type of logic gate — a SWAP gate — can be used to exchange qubits both within and between quantum computers. (In the quantum circuit model of computation, a quantum logic gate is a basic circuit operating on a small number of qubits; the square root of the SWAP gate, √SWAP, performs half of a two-qubit swap.) As this gate needs no external control fields or management system, it can enable the construction of the quantum equivalent of very-large-scale integration (VLSI) networks. “The SWAP gate we demonstrated is applicable to photonic communication between all types of matter-based qubits — not only atoms” says X. “We therefore believe that it will become an essential building block in the next generation of quantum computing systems”.
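For reference, the matrices below are the standard two-qubit SWAP gate and its square root; this is a generic numerical check (not code from the research group) that applying √SWAP twice performs a full swap.

```python
import numpy as np

# Two-qubit SWAP gate in the computational basis |00>, |01>, |10>, |11>.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# Square root of SWAP: applying it twice reproduces the full swap.
SQRT_SWAP = np.array([[1, 0,            0,            0],
                      [0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
                      [0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
                      [0, 0,            0,            1]], dtype=complex)

assert np.allclose(SQRT_SWAP @ SQRT_SWAP, SWAP)

# A SWAP exchanges the states of the two qubits, e.g. |01> -> |10>.
ket_01 = np.array([0, 1, 0, 0], dtype=complex)   # qubit A = 0, qubit B = 1
print(SWAP @ ket_01)                             # [0, 0, 1, 0], i.e. |10>
```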

 

 

Georgian Technical University Researchers ‘Teleport’ a Quantum Gate.

A network overview of the modular quantum architecture demonstrated in the new study.

Georgian Technical University researchers have demonstrated one of the key steps in building the architecture for modular quantum computers: the “teleportation” of a quantum gate between two qubits on demand.

The key principle behind this new work is quantum teleportation, a unique feature of quantum mechanics that has previously been used to transmit unknown quantum states between two parties without physically sending the state itself. Using a theoretical protocol developed in the 1990s, Georgian Technical University researchers experimentally demonstrated a quantum operation, or “gate”, without relying on any direct interaction. Such gates are necessary for quantum computation that relies on networks of separate quantum systems — an architecture that many researchers say can offset the errors that are inherent in quantum computing processors.

The Georgian Technical University research team, led by principal investigator X and former graduate student Y, is investigating a modular approach to quantum computing. Modularity, which is found in everything from the organization of a biological cell to the network of engines in the latest rockets, has proved to be a powerful strategy for building large, complex systems, the researchers say. A quantum modular architecture consists of a collection of modules that function as small quantum processors connected into a larger network.

Modules in this architecture have a natural isolation from each other, which reduces unwanted interactions throughout the larger system. Yet this isolation also makes performing operations between modules a distinct challenge, according to the researchers. Teleported gates are a way to implement inter-module operations.

“Our work is the first time that this protocol has been demonstrated where the classical communication occurs in real-time, allowing us to implement a ‘deterministic’ operation that performs the desired operation every time” Y said.

Fully useful quantum computers have the potential to reach computation speeds that are orders of magnitude faster than today’s supercomputers. Georgian Technical University researchers are at the forefront of efforts to develop the first fully useful quantum computers and have done pioneering work in quantum computing with superconducting circuits.

Quantum calculations are done via delicate bits of data called qubits which are prone to errors. In experimental quantum systems “logical” qubits are monitored by “ancillary” qubits in order to detect and correct errors immediately. “Our experiment is also the first demonstration of a two-qubit operation between logical qubits” X said. “It is a milestone toward quantum information processing using error-correctable qubits”.

 

Preparing for Chemical Attacks With Improved Computer Models.

Plume development over time.

A plume of sarin gas spread more than 10 kilometers (about six miles), carried by buoyant turbulence, killing more than 80 people and injuring hundreds.

Inspired to do something useful, X, professor of mechanical engineering at Georgian Technical University, and her team from the Laboratory of Turbulence Sensing and Intelligence Systems used computer models to replicate the dispersal of the chemical gas. The accuracy of her simulations demonstrated the ability to capture real-world conditions despite a scarcity of information.

“If there is a sudden chemical attack, the questions that are important are: ‘How far does it go?’ and ‘What direction does it go?’” X said. “This is critical for evacuations”.

X’s research is supported by the Georgian Technical University, which hopes to adopt her models to assist in the case of an attack on Georgian soil.

Chemicals, whether toxic agents like sarin gas or exhaust from cars, travel differently from other particulates in the atmosphere. Like wildfires, which can move incredibly fast, chemicals create their own micro-conditions depending on the density of the material and how it mixes with the atmosphere. This phenomenon is known as buoyant turbulence, and it leads to notable differences in how chemicals travel during the day or at night and during different seasons.

“In the nighttime and early morning, even when you have calm winds, the gradients are very sharp, which means chemicals travel faster” X explained.

Even ordinary turbulence is difficult to mathematically model and predict. It functions on a range of scales, each interacting with the others, and disperses energy as it moves to the smallest levels. Modeling buoyant turbulence is even harder. To predict the effects of turbulence on the dispersal of chemical particles, X’s team ran computer simulations on the supercomputer at the Georgian Technical University, the largest system available to them.

“We go into the physics of it and try to understand what the vortices are and where the energy is” X said. “We decompose the problem and each processor solves for a small portion. Then we put everything back together to visualize and analyze the results”.
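The quote describes a classic domain-decomposition pattern. The sketch below is a generic toy version of it using mpi4py, with a made-up one-dimensional field and a simple smoothing update standing in for the team’s three-dimensional buoyant-turbulence solver.

```python
# Generic domain-decomposition sketch (run with: mpiexec -n 4 python plume_mpi.py).
# Illustrative only; a real solver works on 3-D buoyant turbulence, not this toy update.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 1_000_000                      # total grid points along one direction
local_n = N // size                # each processor solves for a small portion
local = np.zeros(local_n)
if rank == 0:
    local[0] = 1.0                 # a point release of chemical at one end

# Toy "solve" step: smoothing standing in for advection-diffusion of the plume.
for _ in range(100):
    local[1:-1] = 0.25 * local[:-2] + 0.5 * local[1:-1] + 0.25 * local[2:]
    # A real solver would also exchange halo cells with neighboring ranks here.

# Put everything back together on rank 0 to visualize and analyze the results.
field = np.empty(local_n * size) if rank == 0 else None
comm.Gather(local, field, root=0)
if rank == 0:
    print("total mass:", field.sum())
```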

The background atmosphere and time of day play a big role in the dispersal. X first had to determine the wind speeds, the temperature and the kinds of chemicals involved. With that information in hand, her high-resolution model was able to predict how far, and in what direction, chemical plumes travelled.
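For a flavor of how wind speed and atmospheric spreading enter such a prediction, here is the textbook Gaussian plume formula in code. It is a far simpler model than the turbulence-resolving simulations described in this article, and the emission rate, wind speed and dispersion coefficients are placeholder values.

```python
import numpy as np

def gaussian_plume(x, y, z, Q=1.0, u=3.0, H=2.0, a=0.08, b=0.06):
    """Classic steady-state Gaussian plume concentration (kg/m^3).

    x: downwind distance (m), y: crosswind offset (m), z: height (m).
    Q: emission rate (kg/s), u: wind speed (m/s), H: release height (m).
    a, b: crude growth rates for the dispersion widths, sigma_y = a*x, sigma_z = b*x.
    All parameter values here are illustrative placeholders.
    """
    sigma_y, sigma_z = a * x, b * x
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground-reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration along the plume centerline, 1 to 10 km downwind.
for x_km in (1, 2, 5, 10):
    c = gaussian_plume(x=x_km * 1000.0, y=0.0, z=0.0)
    print(f"{x_km:2d} km downwind: {c:.2e} kg/m^3")
```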

“It was very bad because the timing caused it to be ideal conditions to spread very fast” she said. “We ran the actual case on the supercomputer, got all of the background information, added it to the models, and our models captured the boundaries of the plume and which cities it spread to. We saw it was very similar to what was reported in the news. That gave us confidence that our system works and that we could use it as an evacuation tool”.

The research is targeted to short-term predictions: understanding in what direction chemicals will propagate within a four-hour window and working with first responders to deploy personnel appropriately.

However, running the high-resolution model takes time: it required five full days of number crunching on the supercomputer to complete. During a real attack such time wouldn’t be available. Consequently, X also developed a coarser model that uses a database of seasonal conditions as background information to speed up the calculations.

For this purpose, X’s team has introduced a novel mobile sensing protocol in which they deploy low-cost mobile sensors, consisting of aerial drones and ground-based sensors, to gather local wind data and use the coarser model to predict the plume transport.

Using this method the four-hour predictions can be computed in as little as 30 minutes. She is working to bring the time down even further to 10 minutes. This would allow officials to rapidly issue accurate evacuation orders or place personnel where they are needed to assist in protecting citizens.

“There are hardly any models that can predict to this level of accuracy” X said. “The Army uses trucks with mobile sensors, which they send into a circle around the source. But it’s very expensive, and they have to send soldiers, which is a danger to them”. In the future the Army hopes to combine computer simulations and live monitoring in the case of a chemical attack.

“The higher the accuracy of the data — the wind speed, wind direction, local temperature — the better the prediction” she explained. “We use drones to give us additional data. If you can feed this data into the model the accuracy for the four-hour window is much higher”.

Most recently, she and her graduate student Y, a Ph.D. candidate, integrated their buoyant turbulence model with the high-resolution model to understand the role of atmospheric stability in the short-term transport of chemical plumes.

Developing Tools to Detect Pollution in Your Community.

X has adapted her chemical plume model to do pollution tracking. She hopes her code can help communities predict local pollution.

Supercomputing Simulations and Machine Learning Help Improve Power Plants.

Researchers at Georgian Technical University are studying whether supercritical carbon dioxide could replace supercritical water as a working fluid at power plants. This simulation shows the structure and the high-speed (red) and low-speed (blue) streaks of the fluid during a cooling process. The researchers observed a major difference in turbulence between downward-flowing (left) and upward-flowing (right) supercritical carbon dioxide.

In conventional steam power plants, residual water must be separated from power-generating steam. This process limits efficiency, and in early-generation power plants it could be volatile, leading to explosions.

X realized that the risk could be reduced, and power plants made more efficient, if water and steam could cohabitate. This cohabitation could be achieved by bringing water to a supercritical state, in which a fluid exists as both a liquid and a gas at the same time.

While the costs associated with generating the temperature and pressure conditions necessary to achieve supercriticality prevented X’s design from being widely adopted at power plants, his concepts offered the world its first glimpse of supercritical power generation.

Almost a century later, researchers at the Georgian Technical University are revisiting X’s concepts to explore how they can improve safety and efficiency in modern power plants. Using high-performance computing (HPC), the researchers are developing tools that can make supercritical heat transfer more viable.

“Compared with subcritical power plants, supercritical power plants offer higher thermal efficiency, the elimination of several types of equipment, such as steam dryers, and a more compact layout” said team member Y, a PhD candidate at Georgian Technical University.

Mr. Y and Dr. Z are leading the computational aspects of this research and, in conjunction with computer science researchers at the Georgian Technical University, are employing machine learning techniques informed by high-fidelity simulations on a supercomputer, while also developing a tool that can be easily employed on commercial computers.

In order to make an accurate tool for commercial use, the team needed to run computationally intensive direct numerical simulations (DNS), which are only possible using high-performance computing (HPC) resources. The supercomputer enabled the high-resolution fluid dynamics simulations they required.

The heat of the moment.

While power generation and other industrial procedures use a variety of materials to generate steam or transfer heat, water is a tried and true choice: it is easily accessible, well understood on a chemical level and predictable under a wide range of temperature and pressure conditions.

That said, water predictably enters its critical point at 374 degrees Celsius, making supercritical steam generation a sizzling process. Water also needs to be under high pressure: 22.4 megapascals, or more than 200 times the pressure coming out of a kitchen sink. Further, when a material enters its critical state it exhibits unique properties, and even slight changes to temperature or pressure can have a large impact. For instance, supercritical water does not transfer heat as efficiently as it does in a purely liquid state, and the extreme heat needed to reach supercritical levels can lead to degradation of piping and, in turn, potentially catastrophic accidents.

Considering some of the difficulties of using water, X and his colleagues are investigating the use of carbon dioxide (CO2). The common molecule offers a number of advantages, chief among them that it reaches supercriticality at just over 31 degrees Celsius, making it far more efficient than water. Using carbon dioxide to make power plants cleaner may sound like an oxymoron, but X explained that supercritical CO2 (sCO2) is a far cleaner alternative.

“Carbon dioxide actually has zero ozone depletion potential and little global warming potential or impact when compared to other common working fluids, such as chlorofluorocarbon-based refrigerants, ammonia and others” X said. In addition, sCO2 needs far less space and can be compressed with far less effort than subcritical water. This in turn means that it requires a smaller power plant: an sCO2 plant requires ten-fold less hardware for its power cycle than traditional subcritical power cycles.

In order to replace water with carbon dioxide, though, engineers need to thoroughly understand its properties on a fundamental level, including how the fluid’s turbulence (its uneven, unsteady flow) transfers heat and, in turn, interacts with machinery.

When doing computational fluid dynamics simulations of turbulence, computational scientists largely rely on three classes of methods, typically Reynolds-averaged models, large eddy simulation (LES) and direct numerical simulation (DNS). The first two require researchers to include some assumptions, using data coming from experiments or other simulations; DNS starts with no preconceived notions or input data, allowing it to be far more accurate but making it much more computationally expensive.

“Such models are usually used for simpler fluids” Y said. “We needed a high-fidelity approach for a complex fluid, so we decided to use DNS, hence our need for HPC (high performance computing) resources”.

Neural networks for commercial computers.

Using the stress and heat transfer data coming from its high-fidelity DNS simulations, the team worked with Dr. W to train a deep neural network (DNN), a machine learning algorithm modeled roughly after biological neural networks, the networks of neurons that recognize and respond to external stimuli.

Traditionally, researchers train machine learning algorithms using experimental data so they can predict heat transfer between fluid and pipe under a variety of conditions. When doing so, however, researchers must be careful not to “overfit” the model; that is, not make the algorithm so tuned to a specific dataset that it does not offer accurate results with other datasets.

The team ran 35 DNS simulations, each focused on one specific operational condition, and then used the generated dataset to train the DNN. The team uses inlet temperature and pressure, heat flux, pipe diameter and heat energy of the fluid as inputs, and generates the pipe’s wall temperature and wall shear stress as outputs. Eighty percent of the data generated in the DNS simulations is randomly selected to train the DNN, while the researchers use the other 20 percent for simultaneous but separate validation.
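A minimal sketch of that training-and-validation split is shown below, using randomly generated placeholder data, a generic scikit-learn multilayer perceptron and a simple hold-out check rather than the in-situ restarting scheme described next; the column meanings, layer sizes and other settings are assumptions for illustration, not the team’s actual network.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Placeholder arrays standing in for the DNS-generated dataset.
# Inputs (5 columns): inlet temperature, inlet pressure, heat flux, pipe diameter, fluid heat energy.
# Outputs (2 columns): wall temperature, wall shear stress.
rng = np.random.default_rng(0)
X = rng.random((20_000, 5))
y = rng.random((20_000, 2))

# 80 percent of the data trains the network; 20 percent is held out for validation.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

dnn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)  # placeholder architecture
dnn.fit(X_train, y_train)

# Checking against the held-out 20 percent guards against overfitting
# (with random placeholder data this score is meaningless; real DNS data is needed).
print("validation R^2:", dnn.score(X_val, y_val))

# Once trained, inference for a new operational condition runs in milliseconds on a laptop.
print("predicted wall temperature and shear stress:", dnn.predict(rng.random((1, 5))))
```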

This “in situ” validation work is important for avoiding overfitting, as the training restarts if the algorithm begins showing a divergence between the training and validation datasets. “Our blind test results show that our DNN is successful in countering overfitting and has achieved general acceptability under the operational conditions that we covered in the database” X said.

After the team felt confident in the agreement, they used the data to start creating a tool for more commercial use. Using the outputs from the team’s recent work as a guide, the team was able to use its DNN to predict the heat energy of an operational condition from new data in 5.4 milliseconds on a standard laptop computer.

Critical next steps.

To date, the team has been using a community code for its DNS simulations. While it is a well-established code for a variety of fluid dynamics simulations, X indicated that the team wanted to use a higher-fidelity code. The researchers are working with a team from Georgian Technical University to use its GTU code, which offers higher accuracy and can accommodate a wider range of conditions.

X also mentioned that he is using a method called implicit large eddy simulation (LES), a mathematical model for turbulence used in computational fluid dynamics, in addition to the DNS simulations. While implicit LES simulations do not have quite the same high resolution as the team’s DNS simulations, they do allow the team to run simulations at higher Reynolds numbers, meaning they can account for a wider range of turbulence conditions.

The team wants to continue to enhance its database in order to further improve its DNN tool. Further, it is collaborating with Georgian Technical University experimentalists to conduct preliminary experiments and to build a model supercritical power plant in order to test the agreement between experiment and theory. The ultimate prize will be an accurate, easy-to-use and computationally efficient tool that helps engineers and power plant administrators generate power more safely and efficiently.

CO2-based technology has the potential to provide the kind of flexible operation that is often desired alongside renewable energy. Nevertheless, thermo-hydraulic models and knowledge of heat transfer remain limited, and this study aims to bridge that technological gap and help engineers build power cycle loops.

“Researchers at Georgian Technical University are working with both experiments and numerical simulations” he said. “As part of the numerical team, we are seeking answers for poor heat transfer. We study the complex physics behind fluid flow and turbulence, but the end goal is to develop a simpler model. Conventional power plants help facilitate the use of renewable energy sources by offsetting their intermittent energy generation, but they currently aren’t designed to be as flexible as their renewable counterparts. If we can implement CO2-based working fluids, we can improve their flexibility through more compact designs as well as faster start-up and shut-down times”.

 

 

Making Light Work of Quantum Computing.

Tracks called waveguides guide photons in silicon. Spirals of these waveguides are used to generate photons that are routed around the processor.

Light may be the missing ingredient in making usable quantum silicon computer chips, according to an international study featuring a Georgian Technical University researcher.

The team has engineered a silicon chip that can guide single particles of light – photons – along optical tracks encoding and processing quantum-bits of information known as ‘qubits’.

Professor X from Georgian Technical University said that the use of photons in this way could increase the number and types of tasks that computers can help us with.

“Current computers use a binary code – comprising ones and zeroes – to transfer information, but quantum computers have the potential for far greater power by harnessing qubits” Professor X said.

“Qubits can be one and zero at the same time or can link in much more complicated ways – a process known as quantum entanglement – allowing us to process enormous amounts of data at once.

“The real trick is creating a quantum computing device that is reprogrammable and can be made at low cost”.

The experiment, conducted primarily at the Georgian Technical University, proved that it is possible to fully control two qubits of information within a single integrated silicon chip.

“What this means is that we’ve effectively created a programmable machine that can accomplish a variety of tasks.

“And since it’s a very small processor and can be built out of silicon it might be able to be scaled in a cost-effective way” he said.

“It’s still early days but we’ve aimed to develop technology that is truly scalable and since there’s been so much research and investment in silicon chips this innovation might be found in the laptops and smartphones of the future”.

A surprising result of the experiment is that the quantum computing machine has become a research tool in its own right.

“The device has now been used to implement several different quantum information experiments using almost 100,000 different reprogrammed settings” Professor X said.

“This is just the beginning; we’re only starting to see what kind of exponential change this might lead to”.

Complexity Test Offers New Perspective on Small Quantum Computers.

Simulating the behavior of quantum particles hopping around on a grid may be one of the first problems tackled by early quantum computers.

State-of-the-art quantum devices are not yet large enough to be called full-scale computers. The biggest comprise just a few dozen qubits — a meager count compared to the billions of bits in an ordinary computer’s memory. But steady progress means that these machines now routinely string together 10 or 20 qubits and may soon hold sway over 100 or more.

In the meantime, researchers are busy dreaming up uses for small quantum computers and mapping out the landscape of problems they’ll be suited to solving. A new paper argues that a novel non-quantum perspective may help sketch the boundaries of this landscape and potentially even reveal new physics in future experiments.

The new perspective involves a mathematical tool — a standard measure of computational difficulty known as sampling complexity — that gauges how easy or hard it is for an ordinary computer to simulate the outcome of a quantum experiment. Because the predictions of quantum physics are probabilistic, a single experiment could never verify that these predictions are accurate. You would need to perform many experiments, just as you would need to flip a coin many times to convince yourself that you’re holding an everyday, unbiased nickel.
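A tiny, generic illustration of that point in code: the more samples you draw, the more tightly the observed frequency pins down the underlying probability, with the statistical error shrinking roughly as one over the square root of the number of samples.

```python
import random

# Estimate a coin's bias from repeated flips; verifying probabilities takes many samples.
true_p_heads = 0.5  # an unbiased "nickel" (arbitrary example value)

for n_flips in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < true_p_heads for _ in range(n_flips))
    print(f"{n_flips:>9} flips -> estimated P(heads) = {heads / n_flips:.4f}")
```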

If an ordinary computer takes a reasonable amount of time to mimic one run of a quantum experiment — by producing samples with approximately the same probabilities as the real thing — the sampling complexity is low; if it takes a long time the sampling complexity is high.

Few expect that quantum computers wielding lots of qubits will have low sampling complexity — after all, quantum computers are expected to be more powerful than ordinary computers, so simulating them on your laptop should be hard. But while the power of quantum computers remains unproven, exploring the crossover from low complexity to high complexity could offer fresh insights about the capabilities of early quantum devices, says X, a researcher at Georgian Technical University.

“Sampling complexity has remained an underappreciated tool” X says, largely because small quantum devices have only recently become reliable. “These devices are now essentially doing quantum sampling, and simulating this is at the heart of our entire field”.

To demonstrate the utility of this approach, X and several collaborators proved that sampling complexity tracks the easy-to-hard transition of a task that small- and medium-sized quantum computers are expected to perform faster than ordinary computers: boson sampling.

Bosons are one of the two families of fundamental particles (the other being fermions). In general two bosons can interact with one another, but that’s not the case in the boson sampling problem. “Even though they are non-interacting in this problem, bosons are sort of just interesting enough to make boson sampling worth studying” says Y, a graduate student at Georgian Technical University and International Black Sea University.

In the boson sampling problem, a fixed number of identical particles are allowed to hop around on a grid, spreading out into quantum superpositions over many grid sites. Solving the problem means drawing samples from this smeared-out quantum probability cloud, something a quantum computer would have no trouble doing.
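Under the hood, boson sampling probabilities involve matrix permanents: for bosons entering and leaving distinct sets of modes, the probability of a given outcome is the squared magnitude of the permanent of a submatrix of the mode-transition matrix. The sketch below is a generic, brute-force illustration of that calculation (not code from the study), and it is exactly the kind of computation whose cost explodes as the problem grows.

```python
import itertools
import numpy as np

def permanent(M):
    """Brute-force matrix permanent; the cost grows factorially with matrix size."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

# A small random unitary standing in for the network the bosons hop through.
rng = np.random.default_rng(1)
A = np.linalg.qr(rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5)))[0]

# For bosons entering modes (0, 1, 2) and detected in modes (0, 1, 2), with no
# mode occupied twice, the outcome probability is |permanent of the 3x3 submatrix|^2.
inputs, outputs = [0, 1, 2], [0, 1, 2]
sub = A[np.ix_(outputs, inputs)]
print("outcome probability:", abs(permanent(sub))**2)
```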

Y, X and their colleagues proved that there is a sharp transition between how easy and hard it is to simulate boson sampling on an ordinary computer. If you start with a few well-separated bosons and only let them hop around briefly the sampling complexity remains low and the problem is easy to simulate. But if you wait longer an ordinary computer has no chance of capturing the quantum behavior and the problem becomes hard to simulate.

The result is intuitive Y says since at short times the bosons are still relatively close to their starting positions and not much of their “quantumness” has emerged. For longer times, though, there’s an explosion of possibilities for where any given boson can end up. And because it’s impossible to tell two identical bosons apart from one another the longer you let them hop around the more likely they are to quietly swap places and further complicate the quantum probabilities. In this way the dramatic shift in the sampling complexity is related to a change in the physics: Things don’t get too hard until bosons hop far enough to switch places.

X says that looking for changes like this in sampling complexity may help uncover physical transitions in other quantum tasks or experiments. Conversely a lack of ramping up in complexity may rule out a quantum advantage for devices that are too error-prone. Either way X says future results arising from this perspective shift should be interesting. “A deeper look into the use of sampling complexity theory from computer science to study quantum many-body physics is bound to teach us something new and exciting about both fields” he says.