When I launched Diablo III, I didn’t expect the pornography I had been looking at hours previously to be splashed on the screen. But that’s exactly what replaced the black loading screen. Like a scene from a Hollywood movie, the game temporarily froze as it launched, preventing any attempt to clear the screen. The game unfroze just before clearing the screen, and I was able to grab a screenshot (censored with bright red):
Even though this happened hours later, the contents of the incognito window were perfectly preserved.
So how did this happen? A bug in Nvidia’s GPU drivers. GPU memory is not erased before being given to an application, which allows the contents of one application to leak into another. When the Chrome incognito window was closed, its framebuffer was added to the pool of free GPU memory, but it was not erased. When Diablo requested a framebuffer of its own, Nvidia offered up the one previously used by Chrome. Since it wasn’t erased, it still contained the previous contents. And since Diablo doesn’t clear the buffer itself (as it should), the old incognito window was put on the screen again.
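To make this concrete, here is a minimal sketch (my own illustration using GLFW and plain OpenGL, not Diablo’s actual code) of the defensive clear every application should perform before presenting its first frame:

// Clear the freshly allocated framebuffer before showing anything,
// since the driver may hand us memory still holding another
// application's pixels.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(640, 480, "demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);               // opaque black
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // wipe color + depth
    glfwSwapBuffers(window);  // present a known-blank frame before loading

    // ... load assets, then enter the real render loop ...

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}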
In the interest of reproducing the bug, I wrote a program to scan GPU memory for non-zero pixels. It was able to reproduce a Reddit page I had closed on another user account a few minutes earlier, pixel-perfect:
Of course, it doesn’t always work perfectly; sometimes the images are rearranged. I think it has something to do with the page size of memory on a GPU:
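For those wondering how such a scan works, the core idea in plain OpenGL looks roughly like this (an outline, not my exact program; it assumes a current GL 3+ context and a function loader such as glad):

#include <glad/glad.h>  // or any other OpenGL function loader
#include <vector>

// Allocate a texture with a null data pointer, leaving its contents
// undefined. On an affected driver that means stale VRAM, which we
// can then read back and scan for non-zero pixels.
std::vector<unsigned char> grabStaleVram(int width, int height) {
    GLuint tex = 0, fbo = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);  // no data supplied

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    // Read the "fresh" allocation back; non-zero pixels are leftovers.
    std::vector<unsigned char> pixels(static_cast<size_t>(width) * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &tex);
    return pixels;
}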
This is a serious problem. It breaks the operating system’s user boundaries by allowing non-root users to spy on each other. Additionally, it doesn’t need to be specifically exploited to harm users – it can happen purely by accident. Anyone using a shared computer could be exposing anything displayed on their screen to other users of the computer.
It’s a fairly easy bug to fix. A patch to the GPU drivers could ensure that buffers are always erased before giving them to the application. It’s what an operating system does with the CPU RAM, and it makes sense to use the same rules with a GPU. Additionally, Google Chrome could erase its GPU resources before quitting.
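On the application side, the erase could be as simple as this sketch (assuming OpenGL 4.4 for glClearTexImage; on older versions, attach the texture to a framebuffer and glClear it instead):

// Overwrite a sensitive texture with zeros before handing its memory
// back to the driver, so the next owner of that VRAM sees nothing.
void scrubAndDelete(GLuint tex) {
    const GLubyte zero[4] = {0, 0, 0, 0};
    glClearTexImage(tex, 0, GL_RGBA, GL_UNSIGNED_BYTE, zero);
    glDeleteTextures(1, &tex);  // only now return the memory
}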
I submitted this bug to both Nvidia and Google two years ago. Nvidia acknowledged the problem, but as of January 2016 it has not been fixed. Google marked the bug as “Won’t Fix” because Chrome’s incognito mode is apparently not designed to protect you against other users on the same computer (despite nearly everyone using it for that exact purpose).
This happened to me, but with Safari on my iPhone!
Really? Because that would mean the PowerVR drivers are making the exact same mistake. I call bullshit on your statement, but in case it’s not: alert Imagination Technologies about this, and Apple too.
I also had a similar thing happen to me a few times on an iPad Air running iOS 8. I saw a frame from a streamed video (which had been opened in Chrome incognito, though I think the built-in iOS video player is the culprit in this case). This happened many hours (or even days) after closing all apps in the background. I’m not sure what exactly I did to see that frame, but I think it happened while switching apps: a new instance of the video player (with a completely different video) still had an old frame in the zoomed-out preview. I also think it was visible fullscreen for less than a second after zooming into the video player app.
IMHO this has nothing to do with GPU buffers; it’s caused by iOS taking a screenshot of the app in various states (often when the application is “closed” or the multitasking view is triggered). When you launch the app again, you see this screenshot for a moment, to cover the time while the application loads in the background. You can trigger this fairly easily on slower iOS devices, which display the screenshot for a longer period of time. Most often the UI just doesn’t seem responsive (because you are clicking on the image, not on any active controls), but sometimes you see an outdated state, such as a previous incognito session.
It’s a feature, not a bug 😉
“iOS?” He’s referring to OS X.
Latent residual pixel data have been sitting in frame buffers (between application window redraws) for years now.
I have a similar story (about another engineer) from the late 90s, when I was working at a customer demo facility at Silicon Graphics (SGI). It was an 8m wide 2m tall spherical display surface (from a flight simulator), illuminated by three CRT projectors, driven by an SGI Onyx “graphics supercomputer”. The operator did not mute the projectors while the customer’s demo application was loading a large visualisation dataset; when the (OpenGL) application initialised its window, there was a latent image of internet porn (probably newsgroups, not HTTP) on the screen for a fraction of a second, before the application started its render loop.
The latent image looked garbled and color-inverted, and was on screen long enough to be glimpsed, but short enough not to be fully examined. None of the visitors mentioned it, and none of their hosts openly acknowledged it…but everyone saw something. 🙂
The operator’s name was Tyler Durden? It sounds like exactly his job 😀
First and foremost, it’s Chrome’s fault for NOT CLEARING the memory it used. Every application written by programmers with brains does that.
Next up, Blizzard should clear its buffers before use (bonehead programming basics, pre-school level…).
And finally, the king of all boneheads: Microsoft needs to FORCE the issue with drivers and have a way to tell any driver not to make boneheaded mistakes. (BTW, the Nvidia driver is MICROSOFT CERTIFIED TOILET WRAPPING.)
“In graphics programs, clearing the screen (or any of the buffers) is typically one of the”
You do realize he is using Mac OS X?
hahahaha
buUUUUUURN
An Nvidia spokesperson told VentureBeat: “This issue is related to memory management in the Apple OS, not NVIDIA graphics drivers. The NVIDIA driver adheres to policies set by the operating system and our driver is working as expected. We have not seen this issue on Windows, where all application-specific data is cleared before memory is released to other applications.”
So, er, what was that about Microsoft again?
On the face of it, I see no reason why the WebGL context would work any differently than the OpenGL context in this respect.
You might be able to convince the Chrome team to fix it if you can demonstrate that grabbing a new frame buffer via WebGL can also return previously stored images.
Do you have a link to the Chrome bug you filed?
https://code.google.com/p/chromium/issues/detail?id=477328
Yes, this happened to me with Safari on an iPad. I’d close a private window, but somehow you could see the contents of the old window in the background (blurred), or in the history tab with icons of previous web pages…
Slightly different issue, but they should be notified nevertheless.
If “exposing” your porn habit is so embarrassing, perhaps you should fix yourself first.
I think you’re missing the entire point of this post.
Yup, completely missing the point.
Duh, maybe they don’t want to fix it. This is about PC stuff, not morals, dumbass!
Many people watch porn, be it “normal” porn or hardcore porn, but I know nobody who wants their family to know that. Watching porn is not necessarily a bad thing.
Besides, the point is that the content of your private browsing window (which could be something other than porn but still something you don’t want anyone to see) is exposed to other users on your computer.
Haha good 1 ….
@Roving cynic: good 1… ha ha ha…
He has no idea, and shouldn’t even be reading this post.
We all take dumps, it’s a completely normal thing, but I wouldn’t want my friends or family to see the color of my shit. Some things belong to the private sphere, and IMHO they should remain there.
I noticed the same behaviour years ago when I rebooted(!) my PC – it seems to be a side effect of Nvidia’s card design. Since it also affected applications without hardware rendering for me, I don’t think Chrome/Google can fix this on their side… https://www.adlerweb.info/blog/2012/06/20/nvidia-x-org-video-ram-information-leak
Do you have the source of the program you wrote to read the non-zeroed GPU memory?
You’re on OS X, so you’re using GPU drivers written by Apple. Send your bug report to them and update the title of your post to reflect this.
Apple wrote only a very few of the drivers they use, far from all of them. The best-known one is their own driver for some of Intel’s IGPs, given that Intel’s own drivers for both Windows and OS X were complete rubbish, forcing Apple to get the job done themselves. I doubt they implement their own drivers for AMD and Nvidia GPUs.
Even if Apple uses reference code from NVIDIA without any changes, it’s Apple who has to release the fix, as you can’t update drivers on OS X on your own. There is nothing NVIDIA could do. Clearing memory for a new application is the responsibility of the OS.
Also: has anyone tried to exploit this on AMD/Intel cards on OSX? Does this also happen on NVIDIA on Win/Linux?
Also, it is clearly an outdated version of OS X; glossy window control buttons are from at least two years ago. How about updating to the free El Capitan, and then checking whether the problem still exists?
https://hsmr.cc/palinopsia/
The driver cannot clear the memory at resource creation time; this would impact performance. Imagine all the open-world games that create tons of resources every frame: adding extra time to clear them, even though they are eventually going to be written to anyway, is a significant waste of time. The application should be writing to the buffers prior to displaying them.
Nope.
The OS and drivers should guarantee that processes (especially non-root processes) can’t read memory written by other processes. For main memory, no one would dispute this; it seems some people haven’t thought about the issue for GPU memory yet.
And clearing memory is really cheap, and it would only have to happen when requesting memory (uploading textures, …), which is not all that cheap anyway, so some additional overhead should be bearable.
I have no idea how GPUs and their drivers manage memory, but it could just as well happen when releasing memory somewhere, if you think that would be better for performance 😉
Either way, it has to happen *before* another application gets the memory.
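A toy illustration of that policy (ordinary CPU-side C++, emphatically not real driver code): scrub a block the moment it is released, so whoever acquires it next can never see the previous owner’s data.

#include <cstddef>
#include <cstring>
#include <vector>

struct Block { void* ptr; std::size_t size; };

class ScrubbingPool {
    std::vector<Block> free_;
public:
    // Zero the block *on release*, so by the time anyone else can
    // acquire it, the previous owner's data is already gone.
    void release(void* p, std::size_t n) {
        std::memset(p, 0, n);
        free_.push_back({p, n});
    }
    void* acquire(std::size_t n) {
        for (std::size_t i = 0; i < free_.size(); ++i) {
            if (free_[i].size >= n) {  // already scrubbed on release
                void* p = free_[i].ptr;
                free_.erase(free_.begin() + static_cast<std::ptrdiff_t>(i));
                return p;
            }
        }
        return ::operator new(n);  // fresh memory, no previous owner
    }
};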
You lost me at: “. . .the pornography I had been looking at. . . .”
Youporn? What are you, some kind of casual?
I have observed the same behavior with Firefox on Linux and a game (or rather its in-game editor), also with Nvidia drivers.
While I think that this happened because of some kind of bug in the game, as you wrote it has security implications and should really be prevented by the OS or driver.
So, if Incognito doesn’t protect you from the internet knowing what you’re doing, and doesn’t protect against the local computer knowing, who’s it supposed to be protecting against?
This isn’t localized to Chrome at all. I’ve had this problem with Fedora and Nvidia. They just have shitty drivers; we’ve known this forever. Not really worth a blog post.
https://code.google.com/p/chromium/issues/detail?id=477328
“Graphics Card: Dedicated AMD”
The post doesn’t mention that it affects AMD also.
Ignore this guy, nothing but an AMD shill.
You have also forgotten Microsoft’s own product.
This “problem” is by design and will likely never be fixed. The NSA needs this feature to spy, and NVIDIA isn’t allowed to remove it.
So do you normally watch porn?
Unfortunately, Google can’t fix this bug. Even if Chrome clears the memory, there’s a very high chance the GPU driver has made multiple copies.
Triple buffering, discard-mapped memory, memory management in general: you could actually have tens of copies in GPU memory, and Chrome would only be able to clear one of them.
To clear them all, Chrome would have to do a lengthy wipe, repeating the process over and over, similar to file shredders, in the hope that the driver returns the different copies so they all get cleared.
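If Chrome tried anyway, it would look something like this hypothetical “VRAM shredder” (a sketch assuming a current OpenGL context and loader; purely illustrative, with no guarantee that every stale copy is reached):

#include <glad/glad.h>  // or any other OpenGL function loader
#include <vector>

// Keep allocating zero-filled textures until the driver reports
// GL_OUT_OF_MEMORY, overwriting as much free VRAM as possible, then
// release everything again.
void shredVram() {
    const int W = 2048, H = 2048;
    std::vector<unsigned char> zeros(static_cast<size_t>(W) * H * 4, 0);
    std::vector<GLuint> texs;
    while (glGetError() != GL_NO_ERROR) {}  // drain stale error flags
    for (;;) {
        GLuint t = 0;
        glGenTextures(1, &t);
        glBindTexture(GL_TEXTURE_2D, t);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, W, H, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, zeros.data());
        if (glGetError() == GL_OUT_OF_MEMORY) {
            glDeleteTextures(1, &t);
            break;
        }
        texs.push_back(t);
    }
    glDeleteTextures(static_cast<GLsizei>(texs.size()), texs.data());
}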
You’re running on OS X, and that’s where the problem lies. Apple is the one who controls the drivers on that platform (unlike Windows). NVIDIA and AMD can do little about it.
You have to file a bug report with Apple (bugreport.apple.com/logon).
Unfortunately both OS X and Linux are a disaster when it comes to GPU management, so I wouldn’t expect this bug to be fixed very soon.
Why shouldn’t nvidia be able to fix this in future driver versions for Linux?
There they have full control over both the kernel module and the libGL, shouldn’t that be sufficient?
Technically, you’re correct. On Linux, NVIDIA should be able to fix this.
But like I said, the state of GPU memory management is a disaster. I ranted about it on my website (I don’t want to do shameless promotion, so just google “Maybe it’s time to talk about a new Linux Display Driver Model”).
Also make sure to read the comments since a Linux kernel dev dropped by and left his side of the view.
I imagine the real reason Google marked this as “will not fix” is of course that the bug isn’t in Chrome but in the GPU drivers. If the GPU driver was erasing the buffers correctly then Chrome would be needlessly erasing erased memory.
The fact that the bug disrupts the function of incognito in a use case that incognito was not designed to address (even if it is used that way) simply makes it even less of a Chrome problem.
Fix the GPU driver, et voilà – Chrome is fixed.
Quote: “Since Diablo doesn’t clear the buffer itself (as it should), the old incognito window was put on the screen again.”
From a programmer’s point of view: first and most important, Blizzard’s Diablo should fix its “use of memory before initialization” issue. NO programmer should expect the system to return a clean, wiped memory region from malloc(), and the same goes for a frame buffer allocated by whatever video card.
Second, one may ask Google to flush and clear its display buffers in incognito mode, especially when exiting…
Finally a good reply from a real programmer. Kudos.
Although it was interesting to read the other comments as well.
Of course this seems like a bug in Diablo and you shouldn’t expect the memory the OS gives you to have any specific content (like all 0).
However, as a user you expect that memory from one process doesn’t leak into another process.
So if the GPU driver returns non-zero memory, it should either be from the same process (like a texture you loaded before and unloaded again) or completely random (but zeroing it is cheaper than overwriting with random data, so..)
As you mentioned malloc(): on “modern” operating systems, it will not give you memory contents that were written by another process. It may, however, contain data you wrote (and free()’d) in the same process.
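A small demonstration of that same-process reuse (the contents of b are formally undefined; this only shows why you must never rely on malloc() returning zeros, and why calloc() exists):

#include <cstdio>
#include <cstdlib>
#include <cstring>

int main() {
    char* a = static_cast<char*>(std::malloc(64));
    if (!a) return 1;
    std::strcpy(a, "secret");
    std::free(a);

    // Often hands back the very same block, old bytes included.
    char* b = static_cast<char*>(std::malloc(64));
    if (!b) return 1;
    std::printf("%.6s\n", b);  // may well print "secret"; never rely on this
    std::free(b);
    return 0;
}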
The memory management model is OS-specific. I’m not a Mac OS guru, so I would like to hear the blog owner’s OS version.
If the OS does mean to wipe all memory resources and explicitly commands the GPU device driver to do so, but the driver doesn’t comply, then turn to Nvidia for the problem. If, on the other hand, the OS does NOT explicitly command the GPU driver to wipe memory when an application process exits or terminates, turn to Apple for answers.
Given that all the above protection simply relies on the kernel-privilege memory protection model, I personally don’t see much advantage even if the issue is fixed, whether by Apple, Nvidia, or Google: a screen-grabbing program running in the background can capture what you’re browsing in Chrome anyway… You could either ask Google to render incognito web pages using a protected-video-playback-style rendering path, or to render them in TrustZone-protected memory, LoL
Let’s call it the porNvidia bug, and see them fix it in a day.
Does reading the memory and outputting old non-zero bytes also work with WebGL or do browsers have some kind of additional safeguards there?
That’d make this bug *really* critical.
“It’s what an operating system does with the CPU RAM”
… in debug mode. For speed, it’s not cleared. Notice a pattern?