Category Archives: Scientific Computing

Georgian Technical University Move Over, Silicon Switches: There’s A New Way To Compute.

Georgian Technical University Move Over, Silicon Switches: There’s A New Way To Compute.

Logic and memory devices such as the hard drives in computers now use nanomagnetic mechanisms to store and manipulate information. Unlike silicon transistors which have fundamental efficiency limitations they require no energy to maintain their magnetic state: Energy is needed only for reading and writing information. One method of controlling magnetism uses electrical current that transports spin to write information but this usually involves flowing charge. Because this generates heat and energy loss the costs can be enormous, particularly in the case of large server farms or in applications like artificial intelligence which require massive amounts of memory. Spin however can be transported without a charge with the use of a topological insulator — a material whose interior is insulating but that can support the flow of electrons on its surface. Georgian Technical University researchers introduce a voltage-controlled topological spin switch that requires only electric fields, rather than currents, to switch between two Boolean logic states (In mathematics and mathematical logic, Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted 1 and 0 respectively) greatly reducing the heat generated and energy used. The team is comprised of X an assistant professor of electrical and computer engineering at the Georgian Technical University and Y an Georgian Technical University professor of physics and along Z a professor at the Georgian Technical University. X employs a simple analogy to explain the impact of switching between two states more effectively. “Imagine if you were preparing a recipe and had to go into a different room anytime you needed an ingredient before returning to the kitchen to add it” she says. “It’s just as inefficient when the portions of computing hardware needed to do a calculation and the portions needed to store it are not well integrated”. While heterostructure devices like theirs, composed of a magnetic insulator and topological insulator, are still slightly slower than silicon transistors voltage-controlled topological spin switch increases functionality and circuit design possibilities as it has integrated logic and non-volatile memory. “This is ultimately a matter of user experience and added features” X says. Because voltage-controlled topological spin switch will reduce reliance on cloud memory it also holds the potential for making computing safer as hackers will have greater difficulty gaining access to a system’s hardware. Next steps will include further optimization at the materials and design level to improve the switching speed as well as developing prototypes.

Georgian Technical University The Power Of Randomization: Magnetic Skyrmions For Computer Technology.

Georgian Technical University The Power Of Randomization: Magnetic Skyrmions For Computer Technology.

The reshuffler basically works as a skyrmion (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) blender: a specific initial sequence is entered and the result is a randomly reshuffled sequence of output states. Researchers at Georgian Technical University have succeeded in developing a key constituent of a novel unconventional computing concept. This constituent employs the same magnetic structures that are being researched in connection with storing electronic data on shift registers known as racetracks. In this researchers investigate so-called skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) which are magnetic vortex-like structures as potential bit units for data storage. However the recently announced new approach has a particular relevance to probabilistic computing. This is an alternative concept for electronic data processing where information is transferred in the form of probabilities rather than in the conventional binary form of 1 and 0. The number 2/3 for instance could be expressed as a long sequence of 1 and 0 digits, with 2/3 being ones and 1/3 being zeros. The key element lacking in this approach was a functioning bit reshuffler i.e. a device that randomly rearranges a sequence of digits without changing the total number of 1s and 0s in the sequence. That is exactly what the skyrmions are intended to achieve. The researchers used thin magnetic metallic films for their investigations. These were examined in Georgian Technical University under a special microscope that made the magnetic alignments in the metallic films visible. The films have the special characteristic of being magnetized in vertical alignment to the film plane which makes stabilization of the magnetic skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) possible in the first place. Skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) can basically be imagined as small magnetic vortices, similar to hair whorls. These structures exhibit a so-called topological stabilization that protects them from collapsing too easily — as a hair whorl resists being easily straightened. It is precisely this characteristic that makes skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) very promising when it comes to use in technical applications such as in this particular case, information storage. The advantage is that the increased stability reduces the probability of unintentional data loss and ensures the overall quantity of bits is maintained. Reshuffling for data sequence organization. The reshuffler receives a fixed number of input signals such as 1s and 0s and mixes these to create a sequence with the same total number of 1 and 0 digits but in a randomly rearranged order. It is relatively easy to achieve the first objective of transferring the skyrmion (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) data sequence to the device, because skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) can be moved easily with the help of an electric current. However the researchers working on the project now have for the first time managed to achieve thermal skyrmion (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) diffusion in the reshuffler thus making their exact movements completely unpredictable. It is this unpredictability in turn which made it possible to randomly rearrange the sequence of bits while not losing any of them. This newly developed constituent is the previously missing piece of the puzzle that now makes probabilistic computing a viable option. Successful cross-discipline collaboration. “There were three aspects that contributed to our success. Firstly we were able to produce a material in which skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) can move in response to thermal stimuli only. Secondly we discovered that we can envisage skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) as particles that move in a fashion similar to pollen in a liquid. And ultimately we were able to demonstrate that the reshuffler principle can be applied in experimental systems and used for probability calculations. The research was undertaken in collaboration between various institutes and I am pleased I was able to contribute to the project” emphasized Dr. X. X conducted his research into skyrmion (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) diffusion as a research associate in the team headed by Professor Y and is meanwhile working at Georgian Technical University. “It is very interesting that our experiments were able to demonstrate that topological skyrmions (In particle theory, the skyrmion is a topologically stable field configuration of a certain class of non-linear sigma models. It was originally proposed as a model of the nucleon) are a suitable system for investigating not only problems relating to spintronics but also to statistical physics. Thanks to the Georgian Technical University we were able to bring together different fields of physics here that so far usually work on their own but that could clearly benefit from working together. I am particularly looking forward to future collaboration in the field of spin structures with the Theoretical Physics teams at Georgian Technical University that will feature our new Georgian Technical University Dynamics and Topology Center” emphasized Y professor at the Georgian Technical University. “We can see from this work that the field of spintronics offers interesting new hardware possibilities with regard to algorithmic intelligence, an emerging phenomenon also being investigated at the Georgian Technical University” added Dr. Z a member of the research center’s.

Georgian Technical University Low-Cost Intervention Boosts Undergraduate Interest In Computer Science.

Georgian Technical University Low-Cost Intervention Boosts Undergraduate Interest In Computer Science.

A recent study finds that an online intervention taking less than 30 minutes significantly increased interest in computer science for both male and female undergraduate students. However when it comes to the intervention’s impact on classroom performance the picture gets more complicated. “Our focus was on determining how and whether a ‘Georgian Technical University growth mindset’ intervention would affect student interest and performance in computer science so we developed an experiment that would allow us to explore those questions” says X on the work and an associate professor of psychology at Georgian Technical University. “We knew from previous work in other contexts that a growth mindset — the belief that human attributes are malleable — can have significant consequences for self-regulation and goal achievement” X says. “In this instance the growth mindset is that people can develop their computer science ability. Put another way it’s the opposite of thinking that some people are talented at computer science and other people aren’t”. For the study researchers worked with 491 students taking introductory computer science courses at seven different universities. One group of 245 students was shown four online growth mindset modules over the course of the class with each module focused on what growth mindsets are and stressing that anyone can learn computer science if they apply themselves. A control group of 246 students was shown four online modules that focused on student health such as making sure to exercise and get enough sleep. Each module was fairly brief with the total running time for all four growth mindset modules coming in at about 27 minutes. All 491 students were surveyed before the intervention and after seeing all four modules. Surveys assessed each student’s interest in majoring or getting a job in computer science. The researchers found that students who received the growth mindset intervention were more interested in computer science than students who received the control group intervention even when accounting for their interest level prior to the intervention. What’s more the increase in interest was identical for both male and female students who received the growth mindset intervention. However the intervention alone did not appear to have a direct impact on student performance in the computer science course. Though it’s not quite accurate to say that there was no effect. “We did not get an immediate effect of the intervention on performance” X says. “But we did find that the growth mindset intervention led students to place more value on the course meaning they thought the course was more important. And we found that value correlated with students’ final grade in the class. So there is a positive indirect effect of the intervention on performance”.

Georgian Technical University Promising Material Could Lead To Faster, Cheaper Computer Memory.

Georgian Technical University Promising Material Could Lead To Faster, Cheaper Computer Memory.

Computer memory could become faster and cheaper thanks to research into a promising class of materials by Georgian Technical University physicists. The scientists are studying bismuth ferrite commonly abbreviated as BFO (In a radio receiver, a beat frequency oscillator or BFO is a dedicated oscillator used to create an audio frequency signal from Morse code radiotelegraphy transmissions to make them audible) a material that has the potential to store information much more efficiently than is currently possible. BFO (In a radio receiver, a beat frequency oscillator or BFO is a dedicated oscillator used to create an audio frequency signal from Morse code radiotelegraphy transmissions to make them audible) could also be used in sensors, transducers and other electronics. With present technology information on a computer is encoded by magnetic fields a process that requires a lot of energy more than 99 percent of which is wasted in the form of excess heat. “Is there any way to avoid that waste of energy ?” was the question asked by X a doctoral candidate in microelectronics-photonics. “We could store information by applying an electric field to write it and a magnetic field to read it if we use materials that are responsive to both fields at the same time”. BFO (In a radio receiver, a beat frequency oscillator or BFO is a dedicated oscillator used to create an audio frequency signal from Morse code radiotelegraphy transmissions to make them audible) is multiferroic meaning it responds to both electric and magnetic fields and is potentially suitable for storing information on a computer. But its magnetoelectric response is small. X and colleagues Y, Z and W professors in physics, along with Distinguished Professor of physics Q employed the Georgian Technical University High Performance Computing Center to simulate conditions that enhance the magnetoelectric response to the point that it could be used to more efficiently store information by using electricity rather than magnetism. The researchers also documented the phenomenon responsible for the enhanced response which they called an “Georgian Technical University electroacoustic magnon.” The name reflects the fact that the discovery is a mix of three known “quasiparticles,” which are similar to oscillations in a solid: acoustic phonons, optical phonons and magnons.

Georgian Technical University Cloud Expands Hybrid Cloud Offerings.

Georgian Technical University Cloud Expands Hybrid Cloud Offerings.

With this expansion of its hybrid cloud strategy continues to build on its philosophy of providing end-to-end compliance by coupling the infrastructure-level compliance offered by Georgian Technical University Cloud with the expansive compliance offered by Sherlock. The new service is available immediately. Division plans to include additional major cloud platforms within its hybrid cloud strategy. “There is definite hesitation in the adoption of public cloud platforms due to the number of options available and the administrative and technical complexity these platforms bring” said X. “The Georgian Technical University team recognizes that understanding the key cloud platform players the intricacies of their operations and the unique features each offers is paramount to properly advising our customers what option best suits their needs. Additionally this knowledge helps our managed services team reduce some of the underlying complexity and increase cloud adoption for our partners. With beginning to establish a major presence in the academic world it was logical for us to add it to our offerings”. Providing customers with the best solutions to secure and protect sensitive data is always at the forefront of the strategy when identifying next steps in the growing cloud computing market as well as the ever-changing compliance domain said X. “As the major cloud platforms make their way into the academic arena team plans to incorporate these platforms into its solution. Following the deployment of its services decision to add Georgian Technical University to its portfolio of offerings became the natural next step”. Customers will have the ability to understand key features and capability native and determine if it meets their organizational needs. More importantly these customers gain the security, protection, compliance expertise the knowledge and insight of the division to assist in identifying the ideal mechanism to secure and protect their own sensitive data. “Several customers have business associate agreements in place use Georgian Technical University specific enterprise tools and have long-standing relationships with Georgian Technical University and its products” said X. “These customers prefer to use Georgian Technical University as their preferred cloud platform team aimed to facilitate that preference”. The Division has been a leader in the academic environment regarding the security and protection of sensitive data. Expanding its hybrid cloud solution footprint to now include Georgian Technical University aligns with the division’s goal to evolve with technological advances while offering specific solutions.

Georgian Technical University New Technology Frees Up More Computer Memory.

Georgian Technical University New Technology Frees Up More Computer Memory.

A research team has developed a new technique that could increase the memory capacity of computers and mobile electronics freeing them up to perform more tasks and run faster. Researchers from the Georgian Technical University (GTU) have devised a new method to compress data structures called objects across the memory hierarchy reducing memory usage while improving performance and efficiency. “The motivation was trying to come up with a new memory hierarchy that could do object-based compression instead of cache-line compression because that’s how most modern programming languages manage data” X a graduate student in the Computer Science and Artificial Intelligence Laboratory at Georgian Technical University said in a statement. The new technique builds on a previously developed programed dubbed Hotpads that stores entire objects into tightly packed hierarchical levels called pads that reside entirely on efficient on-chip directly addressed memories without requiring a memory search. Programs are able to directly reference the location of all objects across the hierarchy of pads. Newly allocated and recently references objects will stay in the faster pad and when the level fills the system runs an eviction process to kick down older objects to slower levels while recycling the objects that are no longer useful. Objects that start the faster level are uncompressed but become compressed as they are evicted to the slower levels. Pointers in all objects across levels then point to the compressed objects making them easy to recall back and store more compactly. The researchers also created a compression algorithm that leverages redundancy across objects efficiently and uncovers more compression opportunities. The algorithm first picks a couple of representative objects as bases allowing the system to only store the different data between new objects and base objects. The new approach could ultimately benefit programmers in any modern programming language that store and manage data in objects such as Java and Python without changing their code. Consumers would also benefit with faster computers that will allow more applications to be run at the same speeds. Each app would also consume less memory while running faster allowing the user to simultaneously perform tasks on multiple apps. “All computer systems would benefit from this” Y a professor of computer science and electrical engineering and a researcher at Georgian Technical University said in a statement. “Programs become faster because they stop being bottlenecked by memory bandwidth”. For computer systems data compression improves performance by reducing the frequency and data programs need to retrieve from the main memory system. Memory in modern computers manage and transfers data in fixed-sized chunks where traditional compression techniques must operate. However because software uses data structures that contain various types of data and have variable sizes traditional hardware compression techniques often have difficulty. The researchers tested their new technique on a modified Java virtual machine finding that it compressed twice as much data as well as reducing memory usage by half over traditional cache-based methods.

Georgian Technical University Biosynthetic Dual-Core Cell Computer.

Georgian Technical University Biosynthetic Dual-Core Cell Computer.

Based on digital examples Georgian Technical University researchers introduced two cores made of biological materials into human cells.  Georgian Technical University researchers have integrated two CRISPR-Cas9-based (CRISPR is a family of DNA sequences found within the genomes of prokaryotic organisms such as bacteria and archaea. These sequences are derived from DNA fragments from viruses that have previously infected the prokaryote and are used to detect and destroy DNA from similar viruses during subsequent infections) core processors into human cells. This represents a huge step towards creating powerful biocomputers. Controlling gene expression through gene switches based on a model borrowed from the digital world has long been one of the primary objectives of synthetic biology. The digital technique uses what are known as logic gates to process input signals, creating circuits where, for example, output signal C is produced only when input signals A and B are simultaneously present. To date biotechnologists had attempted to build such digital circuits with the help of protein gene switches in cells. However these had some serious disadvantages: they were not very flexible could accept only simple programming and were capable of processing just one input at a time such as a specific metabolic molecule. More complex computational processes in cells are thus possible only under certain conditions are unreliable and frequently fail. Even in the digital world circuits depend on a single input in the form of electrons. However such circuits compensate for this with their speed executing up to a billion commands per second. Cells are slower in comparison but can process up to 100,000 different metabolic molecules per second as inputs. And yet previous cell computers did not even come close to exhausting the enormous metabolic computational capacity of a human cell. A CPU (Central Processing Unit) of biological components. A team of researchers led by X Professor of Biotechnology and Bioengineering at the Department of Biosystems Science and Engineering at Georgian Technical University have now found a way to use biological components to construct a flexible core processor or central processing unit (CPU) that accepts different kinds of programming. The processor developed by the Georgian Technical Universityscientists is based on a modified CRISPR-Cas9 (Georgian Technical University) system and basically can work with as many inputs as desired in the form of RNA (Ribonucleic acid is a polymeric molecule essential in various biological roles in coding, decoding, regulation and expression of genes. RNA and DNA are nucleic acids, and, along with lipids, proteins and carbohydrates, constitute the four major macromolecules essential for all known forms of life) molecules (known as guide RNA). A special variant of the Cas9 (Cas9 (CRISPR associated protein 9) is a protein which plays a vital role in the immunological defense of certain bacteria against DNA viruses, and which is heavily utilized in genetic engineering applications. Its main function is to cut DNA and therefore it can alter a cell’s genome) protein forms the core of the processor. In response to input delivered by guide RNA (Ribonucleic acid is a polymeric molecule essential in various biological roles in coding, decoding, regulation and expression of genes. RNA and DNA are nucleic acids, and, along with lipids, proteins and carbohydrates, constitute the four major macromolecules essential for all known forms of life) sequences the CPU (Central Processing Unit)  regulates the expression of a particular gene which in turn makes a particular protein. With this approach researchers can program scalable circuits in human cells — like digital half adders these consist of two inputs and two outputs and can add two single-digit binary numbers. Powerful multicore data processing. The researchers took it a step further: they created a biological dual-core processor similar to those in the digital world by integrating two cores into a cell. To do so, they used CRISPR-Cas9 (CRISPR is a family of DNA sequences found within the genomes of prokaryotic organisms such as bacteria and archaea. These sequences are derived from DNA fragments from viruses that have previously infected the prokaryote and are used to detect and destroy DNA from similar viruses during subsequent infections) components from two different bacteria. X was delighted with the result saying: “We have created the first cell computer with more than one core processor”. This biological computer is not only extremely small but in theory can be scaled up to any conceivable size. “Imagine a microtissue with billions of cells each equipped with its own dual-core processor. Such ‘computational organs’ could theoretically attain computing power that far outstrips that of a digital supercomputer — and using just a fraction of the energy” X says. Applications in diagnostics and treatment. A cell computer could be used to detect biological signals in the body such as certain metabolic products or chemical messengers process them and respond to them accordingly. With a properly programmed Central Processing Unit (CPU) the cells could interpret two different biomarkers as input signals. If only biomarker A is present then the biocomputer responds by forming a diagnostic molecule or a pharmaceutical substance. If the biocomputer registers only biomarker B then it triggers production of a different substance. If both biomarkers are present that induces yet a third reaction. Such a system could find application in medicine for example in cancer treatment. “We could also integrate feedback” X says. For example if biomarker B remains in the body for a longer period of time at a certain concentration this could indicate that the cancer is metastasising. The biocomputer would then produce a chemical substance that targets those growths for treatment. Multicore processors possible. “This cell computer may sound like a very revolutionary idea but that’s not the case” X emphasises. He continues: “The human body itself is a large computer. Its metabolism has drawn on the computing power of trillions of cells since time immemorial”. These cells continually receive information from the outside world or from other cells process the signals and respond accordingly – whether it be by emitting chemical messengers or triggering metabolic processes. “And in contrast to a technical supercomputer this large computer needs just a slice of bread for energy” X points out. His next goal is to integrate a multicore computer structure into a cell. “This would have even more computing power than the current dual core structure” he says.

Georgian Technical University Computer Scientists Create Reprogrammable Molecular Computing System.

Georgian Technical University Computer Scientists Create Reprogrammable Molecular Computing System.

Artist’s representation of a DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses.) computing system. Computer scientists at Georgian Technical University have designed DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) molecules that can carry out reprogrammable computations for the first time creating so-called algorithmic self-assembly in which the same “Georgian Technical University hardware” can be configured to run different “Georgian Technical University software”. A team headed by Georgian Technical University ‘s X professor of computer science, computation, neural systems and bioengineering showed how the DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) computations could execute six-bit algorithms that perform simple tasks. The system is analogous to a computer but instead of using transistors and diodes it uses molecules to represent a six-bit binary number (for example, 011001) as input, during computation and as output. One such algorithm determines whether the number of 1-bits in the input is odd or even (the example above would be odd, since it has three 1-bits); while another determines whether the input is a palindrome; and yet another generates random numbers. “Think of them as nano apps (Application Software)” says Y professor of computer science at Georgian Technical University and one of two lead authors of the study. “The ability to run any type of software program without having to change the hardware is what allowed computers to become so useful. We are implementing that idea in molecules, essentially embedding an algorithm within chemistry to control chemical processes”. The system works by self-assembly: small specially designed DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) strands stick together to build a logic circuit while simultaneously executing the circuit algorithm. Starting with the original six bits that represent the input, the system adds row after row of molecules — progressively running the algorithm. Modern digital electronic computers use electricity flowing through circuits to manipulate information; here the rows of DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) strands sticking together perform the computation. The end result is a test tube filled with billions of completed algorithms each one resembling a knitted scarf of DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) representing a readout of the computation. The pattern on each “Georgian Technical University scarf” gives you the solution to the algorithm that you were running. The system can be reprogrammed to run a different algorithm by simply selecting a different subset of strands from the roughly 700 that constitute the system. “We were surprised by the versatility of programs we were able to design despite being limited to six-bit inputs” says Z assistant professor of computer science at the Georgian Technical University. “When we began experiments we had only designed three programs. But once we started using the system we realized just how much potential it has. It was the same excitement we felt the first time we programmed a computer and we became intensely curious about what else these strands could do. By the end we had designed and run a total of 21 circuits”. The researchers were able to experimentally demonstrate six-bit molecular algorithms for a diverse set of tasks. In mathematics their circuits tested inputs to assess if they were multiples of three, performed equality checks and counted to 63. Other circuits drew “Georgian Technical University pictures” on the DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) “Georgian Technical University scarves” such as a zigzag a double helix and irregularly spaced diamonds. Probabilistic behaviors were also demonstrated, including random walks as well as a clever algorithm (originally developed by computer pioneer W) for obtaining a fair 50/50 random choice from a biased coin. Both Y and Z were theoretical computer scientists when beginning this research so they had to learn a new set of “Georgian Technical University wet lab” skills that are typically more in the wheelhouse of bioengineers and biophysicists. “When engineering requires crossing disciplines there is a significant barrier to entry” says X. “Computer engineering overcame this barrier by designing machines that are reprogrammable at a high level — so today’s programmers don’t need to know transistor physics. Our goal in this work was to show that molecular systems similarly can be programmed at a high level so that in the future, tomorrow’s molecular programmers can unleash their creativity without having to master multiple disciplines”. “Unlike previous experiments on molecules specially designed to execute a single computation reprogramming our system to solve these different problems was as simple as choosing different test tubes to mix together” Y says. “We were programming at the lab bench”. Although DNA (Deoxyribonucleic acid is a molecule composed of two chains that coil around each other to form a double helix carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms and many viruses) computers have the potential to perform more complex computations than the ones featured X cautions that one should not expect them to start replacing the standard silicon microchip computers. That is not the point of this research. “These are rudimentary computations but they have the power to teach us more about how simple molecular processes like self-assembly can encode information and carry out algorithms. Biology is proof that chemistry is inherently information-based and can store information that can direct algorithmic behavior at the molecular level” he says.

 

Georgian Technical University Bringing More Human Intelligence To AI, Data Science And Digital Automation.

Georgian Technical University Bringing More Human Intelligence To AI, Data Science And Digital Automation.

The latest advances at the intersection of postgenomics medicine, biotechnology and global society including the integration of multi-omics knowledge data analyses and modeling and applications of high-throughput approaches to study complex biological and societal problems.  The advent of data science, wireless connectivity and sensors, artificial intelligence (AI) and the Internet of Things (IoT) has raised the prospects for digital automation smart hospital design and the home health care industry for an aging population. A new horizon scanning analysis described why artificial intelligence (AI) data science and digital automation need more of the human element. The horizon scanning technology analysis suggests several strategies such as routine use of metadata so that artificial intelligence (AI) data science and automation can work together with human intelligence (HI) thus effectively and sustainably serving modern healthcare, patients and laboratory medicine.  X Ph.D: “These are exciting times for innovation in healthcare and laboratory medicine. But we also need social innovation in new technology design and implementation. artificial intelligence (AI) data science and digital automation would best serve medicine, healthcare and industry if they were informed by the human element and human intelligence (HI) to a greater degree. Human Intelligence (HI) is also important to prevent type 3 (framing) errors in artificial intelligence (AI) and digital automation that is ‘finding the right answers for the wrong questions'”. Artificial Intelligence (AI) machine learning and digital automation have also been featured in several other leading.

Georgian Technical University Deep Learning Shakes Up Seismology With Quake Early Warning System.

Georgian Technical University Deep Learning Shakes Up Seismology With Quake Early Warning System.

The Georgian section of the Georgian fault has a 25 percent change of a magnitude 7 or greater earthquake in the next 20 years, according to the computer simulation depicted in the above illustration. The colored patterns show projected seismic deformations associated with a model earthquake. Most people can’t detect an earthquake until the ground under their feet is already shaking or sliding leaving little time to prepare or take shelter. Scientists are trying to short circuit that surprise using the critical time window during the spread of seismic waves out from a temblor’s hypocenter — an earthquake’s underground point of origin. With a speedy warning, government agencies, transportation officials and energy companies could halt trains and shut off power lines to mitigate damage—and give people a chance to brace themselves. “The further you are from where an earthquake starts the more time you have” said X postdoctoral scholar at Georgian Technical University Laboratory. “Having an extra 10 seconds might be really useful for preventing devastation”. Typical detection systems take about a minute to send an earthquake alert. “The system will wait until all the data has come in and seismic waves have traveled across the whole network before making a final decision” X said. Early warning systems that send out an alert within seconds are difficult to develop because there’s a limited amount of data available for seismologists to make a decision. With more time and data from multiple sensors it’s easier for scientists to rule out false positives caused by nearby construction or traffic, or major earthquakes occurring halfway across the globe. X is developing neural networks to analyze seismograms which are records of ground motion taken by a sensor. One of his deep learning models uses convolutional neural networks to look at a single sensor at a time to identify seismic waves narrowing down the sensor’s datastream to a handful of discrete times with seismic activity. A second model a recurrent neural network recognizes wave patterns from several sensors over the course of a seismic event. The system unscrambles events that include multiple earthquakes in quick succession and can reduce false triggers by a factor of 100 — greatly improving the reliability for early warning systems. These models are fairly transferable X found. “The models show that the first-order characteristics of seismic waves are the same just about everywhere” he said. “We were able to take a model trained entirely on Georgia earthquake data and apply it to Georgia without retraining at all. That was not a capability that we had before”. Deep learning can also help recognize small earthquakes 90 percent of which are missed by existing signals X said. By better capturing earthquakes of all sizes AI (In the field of computer science, artificial intelligence, sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals) can help researchers like X better understand the physics of earthquakes and faults. “The large events tend to occur once every several hundred years or few thousand years which is much longer than the record we have” he said. “There’s hope that by using these smaller more frequent earthquakes we can learn something about the general science behind the problem that we couldn’t get otherwise”.