ACM TechNews

Timely Topics for IT Professionals

Chinese Scientists Make New Breakthrough in Quantum Communication

Xinhua

Researchers at the University of Science and Technology of China have demonstrated long-distance free-space quantum key distribution during daylight, a breakthrough they say proves the feasibility of satellite-based quantum communication in daylight and establishes the foundation for a satellite-constellation-based global quantum network. The researchers, led by the Chinese Academy of Sciences' Pan Jianwei, used a wavelength of 1,550 nanometers and developed a free-space, single-mode fiber-coupling technology and ultra-low-noise upconversion single-photon detectors to achieve the daytime distribution across a distance of 53 kilometers (33 miles). Pan notes quantum communication is ultra-secure because a quantum photon can neither be separated nor duplicated, meaning it is impossible to wiretap, intercept, or crack the information it transmits. The research project is part of a broader effort by China to deploy the first-ever global quantum communication network by 2030, which would link a satellite constellation consisting of dozens of quantum satellites and ground-based quantum communication networks.

From "Chinese Scientists Make New Breakthrough in Quantum Communication"
Xinhua (07/25/17)
View Full Article

U.S. Looks to Quantum Science to Beat China in Supercomputing Race

CIO Journal

The U.S. government plans to beat China in the development of an exascale supercomputer system by investing in quantum computing, partly through $258 million in Department of Energy grants allocated over three years. Oak Ridge National Laboratory's Jeff Nichols thinks quantum and neuromorphic computing could become integral accelerator technologies. "These types of accelerators are going to give us the opportunity to be leaders in the field of supercomputers," Nichols says. He notes quantum processors inside an exascale system could help address problems concerning astrophysics, nuclear energy, cybersecurity, and computational climate simulation. Oak Ridge researchers have been conducting cloud-based experiments with a D-Wave Systems quantum computer, and Nichols says protecting the energy grid from hackers is one potential application of quantum computing. The U.S. government says it plans to have at least one exascale supercomputing system in operation by 2021, while China has said it is planning to have its own system up and running by 2020.

From "U.S. Looks to Quantum Science to Beat China in Supercomputing Race"
CIO Journal (07/25/17) Sara Castellanos
View Full Article - May Require Paid Subscription

Chaos Theory Strengthens Digital Locks

Asian Scientist

Researchers at Kyoto University in Japan have definitively demonstrated the strength of a 128-bit digital lock for cybersecurity applications. The researchers released new data on the Vector Stream Cipher (VSC), which represents the first example of a 128-bit key chaotic cipher with provable security. "Many theoretical attacks in the past have failed to break it, but until now we hadn't shown definitive proof of security," says Kyoto University professor Ken Umeno. The researchers conducted several tests, including a method to evaluate the lock's randomness. The team says the enhanced VSC is not only secure, but also structurally simple, so it requires low memory usage compared to existing technology, which makes it useful for high-density data transmission applications such as 5G mobile networks and 4K television broadcasts. "Chaotic ciphers have been in use for about 30 years, but before this study we had not expected to find proof of security," Umeno says.
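The general idea behind chaotic ciphers is to iterate a chaotic map from a secret initial state and use the resulting sensitive-to-initial-conditions trajectory as a keystream. The sketch below is a toy illustration of that idea using the logistic map; it is not the Vector Stream Cipher, and a scheme this simple is known to be insecure in practice.

```python
def logistic_keystream(seed: float, n: int, r: float = 3.99) -> bytes:
    """Generate n keystream bytes from logistic-map iterates x -> r*x*(1-x)."""
    x = seed
    out = []
    for _ in range(n):
        x = r * x * (1 - x)          # chaotic iteration in (0, 1)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data: bytes, seed: float) -> bytes:
    """XOR data with the chaotic keystream; applying it twice decrypts."""
    ks = logistic_keystream(seed, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

msg = b"attack at dawn"
ct = xor_cipher(msg, seed=0.3141592653)
assert xor_cipher(ct, seed=0.3141592653) == msg
```

The seed plays the role of the key: a microscopically different seed yields a completely different trajectory, which is the property chaotic ciphers exploit. VSC's contribution, per the article, is pairing such a construction with an actual proof of security, which this toy lacks.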

From "Chaos Theory Strengthens Digital Locks"
Asian Scientist (07/25/17)
View Full Article

AI-Generated Map Predicts Who Will Die Next in 'Game of Thrones'

Inverse

Researchers at Central European University in Hungary have developed a machine-learning algorithm that can predict which characters on "Game of Thrones" will die next. The researchers quantified each character's social significance, and then entered those figures into the algorithm. The team built a network of the fictional world's social system based on how often characters interacted with each other by pulling data from the show's subtitles, providing almost 600 scenes' worth of information about how often characters appeared in the same scene. The data illustrated which characters had the strongest ties to other characters. After calculating the most important characters, the researchers related "network position to survival." Of the 94 characters considered in the study, 61 had already died, providing information about their social position and level of importance. The researchers found the model accurately predicted the fates of about 75 percent of the characters that were analyzed.
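The scene co-occurrence counting described above can be sketched in a few lines. The character names and scene data below are hypothetical stand-ins for the subtitle-derived data the researchers used, and weighted degree is only one simple proxy for "network position."

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical scene data: each scene lists the characters appearing in it.
scenes = [
    ["Jon", "Sansa", "Arya"],
    ["Jon", "Daenerys"],
    ["Cersei", "Jaime"],
    ["Jon", "Sansa"],
    ["Daenerys", "Cersei", "Jaime"],
]

# Build a weighted co-occurrence network: edge weight = shared scene count.
edges = defaultdict(int)
for scene in scenes:
    for a, b in combinations(sorted(set(scene)), 2):
        edges[(a, b)] += 1

# Weighted degree as a simple proxy for a character's social importance.
importance = defaultdict(int)
for (a, b), w in edges.items():
    importance[a] += w
    importance[b] += w

ranking = sorted(importance.items(), key=lambda kv: -kv[1])
print(ranking)  # most socially central characters first
```

Relating such centrality scores to the known fates of the 61 dead characters is then a standard supervised-learning setup: network features in, survival label out.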

From "AI-Generated Map Predicts Who Will Die Next in 'Game of Thrones'"
Inverse (07/25/17) Mark Kaufman
View Full Article

UCLA Research Offers Clearest Evidence of Long-Sought Majorana Particle

UCLA Newsroom

A research team at the University of California, Los Angeles (UCLA) has uncovered evidence for the existence of the Majorana particle, which could serve as a platform for topological quantum computers. The researchers positioned a superconductor above a thin film insulator so engineers could manipulate the particles into patterns. Exposing this construct to a small magnetic field yielded the Majorana particles' unique quantized signature--a braid-like pattern--in the electrical traffic between the two materials. "We observed quantum behavior, and the signal we saw clearly showed the existence of these particles," says UCLA's Qing Lin He. The Majorana particle carries no electric charge, which optimizes it to carry a qubit, insulates it from external interference, and enables it to leverage and sustain quantum entanglement. The researchers say they will now start exploring the application of Majorana particles in quantum braiding, which would weave them together so information can be stored and processed at super-high speeds.

From "UCLA Research Offers Clearest Evidence of Long-Sought Majorana Particle"
UCLA Newsroom (07/24/17) Matthew Chin
View Full Article

NSF Project Sets Up First Machine Learning Cyberinfrastructure

HPCwire

The U.S. National Science Foundation (NSF) has granted $1 million to Larry Smarr's group at the California Institute for Telecommunications and Information Technology to develop the Cognitive Hardware and Software Ecosystem, Community Infrastructure (CHASE-CI). CHASE-CI will leverage the high-speed Pacific Research Platform (PRP) and make fast graphics-processing unit (GPU) appliances available to researchers to solve machine-learning hardware, software, and architecture challenges. NSF says CHASE-CI "will build a cloud of hundreds of affordable [GPUs], networked together with a variety of neural network machines to facilitate development of next-generation cognitive computing." Playing a crucial role in the infrastructure will be Flexible I/O Network Appliances (FIONAs), or PC-based termination devices. Smarr's goal is to integrate FIONAs with field-programmable gate arrays and plug them into the PRP to support machine learning. Meeting CHASE-CI's software demands will entail the use of deep neural network algorithms, reinforcement learning, and support vector machines.

From "NSF Project Sets Up First Machine Learning Cyberinfrastructure"
HPCwire (07/25/17) John Russell
View Full Article

Microsoft Teams Up With Major Universities to Build the First Scalable Quantum Computer

International Business Times

Purdue University, Delft University of Technology in the Netherlands, the University of Sydney in Australia, and the University of Copenhagen in Denmark have partnered with Microsoft to help build the first scalable quantum computer. The alliance will focus on topological qubits, which depend on as-yet-unproven non-abelian anyons. A quantum system consisting of topological qubits would have data encoded in various states by braiding paths of quasi-particles. A universal set of computational gates would perform computations by braiding quasi-particles, and then measuring the multi-quasi-particle states. Topological qubits are more decoherence-resistant than other kinds of qubits, so error correction would be less necessary. Microsoft currently has a global quantum computing lab network called Station Q, and its newest branch will be based at the University of Sydney's Nanoscience Hub. "We're investing big to get a scalable quantum computer," says David Pritchard at Microsoft's Artificial Intelligence and Research Group.

From "Microsoft Teams Up With Major Universities to Build the First Scalable Quantum Computer"
International Business Times (07/25/17) Mary-Ann Russon
View Full Article

Meet UrduScript, the First Urdu-Based Programming Language

TechJuice

Pakistani computer scientist Asad Memon has developed UrduScript, a new programming language that makes it easier for beginners to learn programming concepts such as variables, for and while loops, if-else conditionals, recursion, functions, and lists. Memon says code written in UrduScript, which is essentially JavaScript with some alterations based on Urdish, a hybrid of Urdu and English commonly used in Pakistan, will first be transformed into JavaScript and then display the result to the user, a process known as transpiling. Transpiling makes it easier for programmers to create code because it enables them to work in a language with which they are familiar before changing it into JavaScript. "The main idea behind it is to facilitate young kids or people who are new to programming," Memon says. He also notes he chose Urdish over Urdu itself because a right-to-left coding style based on the Urdu script would have been very difficult to read.
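In its simplest form, a transpiler of this kind just rewrites source-language keywords into target-language keywords before handing the result to a JavaScript engine. The keyword map below is hypothetical, chosen to illustrate the mechanism; UrduScript's actual vocabulary and implementation may differ.

```python
import re

# Hypothetical Urdish-to-JavaScript keyword map (illustrative only).
URDISH_TO_JS = {
    "agar": "if",
    "warna": "else",
    "jabtak": "while",
    "kaam": "function",
    "rakho": "var",
    "wapis": "return",
}

def transpile(source: str) -> str:
    """Replace whole-word Urdish keywords with their JavaScript equivalents."""
    pattern = re.compile(r"\b(" + "|".join(URDISH_TO_JS) + r")\b")
    return pattern.sub(lambda m: URDISH_TO_JS[m.group(1)], source)

print(transpile("kaam square(x) { wapis x * x; }"))
```

The output is plain JavaScript that any JS engine can execute, which is why the programmer never needs to see the target language. A production transpiler would tokenize rather than regex-replace, to avoid rewriting keywords inside strings.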

From "Meet UrduScript, the First Urdu-Based Programming Language"
TechJuice (07/24/17) Maryam Dodhy
View Full Article

Breakthrough in Spin Wave-Based Information Processing Technology

NUS News (Singapore)

Researchers at the National University of Singapore (NUS) have successfully developed a technique for the simultaneous propagation of spin wave signals in multiple directions at the same frequency, without the need for any external magnetic field. The researchers say their breakthrough permits ultra-low power operations, making it suitable for device integration as well as energy-efficient operation at room temperature. The approach relies on a novel structure comprising different layers of magnetic materials to generate spin wave signals. The discovery builds on an early study by the NUS team in which it developed a device that could transmit and manipulate spin wave signals without the need for any external magnetic field or current. "Collectively, both discoveries would make possible the on-demand control of spin waves, as well as the local manipulation of information and reprogramming of magnetic circuits, thus enabling the implementation of spin wave-based computing and coherent processing of data," says NUS professor Adekunle Adeyeye.

From "Breakthrough in Spin Wave-Based Information Processing Technology"
NUS News (Singapore) (07/24/17)
View Full Article

CPU Architecture After Moore's Law: What's Next?

Computerworld

Views differ on how central-processing unit (CPU) architecture will proceed as Moore's Law becomes increasingly irrelevant, with some experts expecting a massive surge in processor performance while others anticipate more gradual, incremental improvements. "For general-purpose applications, we have run out of ideas for making them faster," says University of California, Berkeley professor David Patterson. "The path forward is domain-specific architecture." Patterson thinks the way ahead will be the addition of highly specialized processors that outperform standard microprocessors. However, such processors run specialized software requiring their own tools and compilers, and Berkeley professor Krste Asanovic says CPU proliferation is bad in terms of software complexity, but unavoidable. Some experts believe quantum computing is the best way to accelerate processing speed, while others say the rapid traction of the processor market is inhibiting advances. Open source processor hardware such as RISC-V is envisioned as a means for more affordable innovation.

From "CPU Architecture After Moore's Law: What's Next?"
Computerworld (07/24/17) Lamont Wood
View Full Article - May Require Free Registration

Google's Algorithmic Photographers Are Almost as Good as the Real Deal

Motherboard

Google researchers Hui Fang and Meng Zhang have developed an artificial neural network called Creatism, which deconstructs photographic aesthetics into measurable factors that machines can learn from professional artistic examples. The researchers say Creatism "mimics the workflow of a landscape photographer, from framing for the best composition to carrying out various post-processing operations." Fang and Zhang first defined photo aesthetics such as image saturation, composition, and detail levels via formulas, based on examples taken by professional photographers. They then applied a dataset of about 15,000 thumbnails of landscape photos so their neural net could learn to adjust images to render the most aesthetically pleasing landscape. When professional photographers assessed 173 of the system's photos, they ranked 41 percent of the images as at or above a semi-professional level, while 13 percent scored significantly higher. Meanwhile, 45 percent of the actual professional photos received the higher professional-quality score.
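Expressing an aesthetic quality "via formulas" can be as simple as averaging a measurable property over an image's pixels. The sketch below is a minimal example of one such formula, mean HSV saturation; it is an illustration of the idea, not Creatism's actual scoring function.

```python
import colorsys

def mean_saturation(pixels):
    """Average HSV saturation over a list of (r, g, b) pixels in [0, 255]."""
    total = 0.0
    for r, g, b in pixels:
        _, s, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        total += s
    return total / len(pixels)

# A washed-out gray patch scores low; a vivid patch scores high.
gray = [(128, 128, 128)] * 4
vivid = [(255, 32, 32), (32, 255, 32), (32, 32, 255), (255, 200, 0)]
print(mean_saturation(gray), mean_saturation(vivid))
```

A differentiable score like this is what lets a neural network tune post-processing operations (saturation boosts, crops, contrast) toward the combination the professional examples imply.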

From "Google's Algorithmic Photographers Are Almost as Good as the Real Deal"
Motherboard (07/23/17) Daniel Oberhaus
View Full Article

Computers Using Linguistic Clues to Deduce Photo Content

Phys.org

Researchers at the University of California, Davis and Disney have found that how a person describes the content of a photo can provide important clues for computer-vision programs to ascertain the location of various objects appearing in the image. The researchers found the sentence structure, instead of just the words, of a caption can help a computer determine where within an image a specific object or action is depicted. The team says the program uses deep-learning techniques to analyze the sentence, and employs its hierarchy to better understand spatial relationships and associate each phrase with the appropriate part of the image. A neural network based on this approach could potentially automate the process of annotating images that then can be used to train visual recognition programs. During testing, the researchers demonstrated the new system produced more accurate localizations than baseline systems that do not consider the structure of natural language.

From "Computers Using Linguistic Clues to Deduce Photo Content"
Phys.org (07/21/17)
View Full Article

Five Times the Computing Power

Linköping University

Researchers at Linköping University in Sweden have developed a method to increase by a factor of five the computing power of a standard algorithm when performed on a field-programmable gate array (FPGA). The FPGA consists of a matrix of logical gates that can be programmed in situ, and can be reprogrammed an unlimited number of times. "This advance will save huge sums for demanding calculations in industry, and will make it possible to implement new functionality without needing to replace the hardware," says Linköping University's Oscar Gustafsson. Typically, an algorithm is selected that can carry out the desired calculations, the architecture is then developed using the required blocks, and the result is transferred to the FPGA. The Linköping researchers changed the signal routes, giving the chip five times the capacity per hardware unit.

From "Five Times the Computing Power"
Linköping University (07/17/17) Monica Westman Svenselius
View Full Article