Computer scientists at the University of California, Riverside have revealed for the first time how easily attackers can use a computer's graphics processing unit (GPU) to spy on web activity, steal passwords, and break into cloud-based applications.
Marlan and Rosemary Bourns College of Engineering doctoral student Hoda Naghibijouybari and postdoctoral researcher Ajaya Neupane, together with Associate Professor Zhiyun Qian and Professor Nael Abu-Ghazaleh, reverse engineered an Nvidia GPU to demonstrate three attacks on both graphics and computational stacks, as well as across them. The group believes these are the first reported general side-channel attacks on GPUs.
All three attacks require the victim to first acquire a malicious program embedded in a downloaded application. The program is designed to spy on the victim's computer.
Web browsers use the GPU to render graphics on desktops, laptops, and smartphones. GPUs are also used to accelerate applications in the cloud and in data centers. Web graphics can expose user information and activity, and computational workloads accelerated by the GPU include applications with sensitive data or algorithms that might be exposed by the new attacks.
GPUs are usually programmed using application programming interfaces, or APIs, such as OpenGL. OpenGL is accessible by any application on a desktop with user-level privileges, making all the attacks practical on a desktop. Since desktop or laptop machines ship with the graphics libraries and drivers preinstalled, the attack can be implemented easily using graphics APIs.
The first attack tracks user activity on the web. When the victim opens the malicious application, it uses OpenGL to create a spy that infers the behavior of the browser as it uses the GPU. Every website has a unique trace in terms of GPU memory utilization, due to the different number of objects and the different sizes of objects being rendered. This signal is consistent across loading the same website several times and is unaffected by caching.
The researchers monitored either GPU memory allocations over time or GPU performance counters and fed these features to a machine learning classifier, achieving website fingerprinting with high accuracy. The spy can reliably obtain all allocation events to see what the user has been doing on the web.
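The classification step described above can be sketched in a few lines. The following is an illustrative stand-in, not the authors' implementation: the "fingerprints" are hypothetical allocation-size traces, and a simple nearest-neighbor rule stands in for the trained machine learning classifier.

```python
# Sketch of the fingerprinting step: classify an observed memory-allocation
# trace by comparing it against labeled reference traces. The traces here
# are synthetic stand-ins for real GPU allocation measurements.

def distance(trace_a, trace_b):
    """Euclidean distance between two fixed-length allocation traces."""
    return sum((a - b) ** 2 for a, b in zip(trace_a, trace_b)) ** 0.5

def classify(trace, references):
    """Return the label of the nearest reference trace (1-NN classifier)."""
    return min(references, key=lambda label: distance(trace, references[label]))

# Hypothetical per-site fingerprints: allocation sizes (in KB) observed
# while each page renders. A real spy records these via graphics APIs.
references = {
    "site-a": [512, 2048, 128, 4096],
    "site-b": [64, 64, 8192, 256],
}

# A noisy observation of the victim loading site-a.
observed = [530, 2000, 140, 4100]
print(classify(observed, references))  # → site-a
```

In practice the paper reports using richer features (allocation events over time and performance counters) and a proper learned classifier rather than nearest neighbor, but the pipeline shape is the same: measure, featurize, classify.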
In the second attack, the authors extracted user passwords. Each time the user types a character, the entire password text box is uploaded to the GPU as a texture to be rendered. Monitoring the interval timing of consecutive memory allocation events leaked the number of password characters and the inter-keystroke timing, well-established techniques for learning passwords.
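The timing inference described here is straightforward to illustrate. Assuming one allocation event per keystroke (as each character triggers a re-render of the text box), recovering the character count and inter-keystroke intervals reduces to taking differences of timestamps. The timestamps below are hypothetical, not real measurements.

```python
# Sketch of the keystroke-timing inference: given timestamps of GPU
# allocation events, one per keystroke, recover the password length
# and the inter-keystroke intervals.

def keystroke_intervals(event_times):
    """Return (character count, inter-keystroke intervals) from event times."""
    intervals = [b - a for a, b in zip(event_times, event_times[1:])]
    return len(event_times), intervals

# Hypothetical allocation-event timestamps in milliseconds.
events = [0, 180, 350, 620, 790]
count, gaps = keystroke_intervals(events)
print(count, gaps)  # → 5 [180, 170, 270, 170]
```

Inter-keystroke intervals are valuable to an attacker because typing rhythm correlates with which key pairs were pressed, narrowing the password search space.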
The third attack targets a computational application in the cloud. The attacker launches a malicious computational workload on the GPU that runs alongside the victim's application. Depending on the neural network's parameters, the intensity and pattern of contention on the cache, memory, and functional units differ over time, creating measurable leakage. The attacker uses machine learning based classification on performance counter traces to extract the victim's secret neural network structure, such as the number of neurons in a specific layer of a deep neural network.
The researchers reported their findings to Nvidia, who responded that they intend to publish a patch that gives system administrators the option to disable access to performance counters from user-level processes. They also shared a draft of the paper with the AMD and Intel security teams to enable them to evaluate their GPUs with respect to such vulnerabilities.
In the future, the group plans to test the feasibility of side-channel GPU attacks on Android phones.
The paper, "Rendered Insecure: GPU Side Channel Attacks are Practical," was presented at the ACM SIGSAC Conference on Computer and Communications Security, October 15-19, 2018, in Toronto, Canada. The research was supported by National Science Foundation grant CNS-1619450.
Story Source:
Materials provided by University of California – Riverside. Original written by Holly Ober. Note: Content may be edited for style and length.