Linked by Thom Holwerda on Tue 22nd May 2012 23:26 UTC
Internet & Networking "Just over two months ago, Chrome sponsored the Pwnium browser hacking competition. We had two fantastic submissions, and successfully blocked both exploits within 24 hours of their unveiling. Today, we'd like to offer an inside look into the exploit submitted by Pinkie Pie." A work of pure art, this. Also, this is not the same person as the other PinkiePie. Also also, you didn't think I'd let a story with a headline like this go by unnoticed, did you?
Comment by Radio
by Radio on Wed 23rd May 2012 06:42 UTC
Radio
Member since:
2009-06-20

Also also, you didn't think I'd let a story with a headline like this go by unnoticed, did you?

Red paint, girlscout, etc?

More seriously, the GPU is *again* the weak link. That is cause for concern for the security of modern browsers: is it manageable when they have so much code touching so much hardware and software?

Reply Score: 5

RE: Comment by Radio
by moondevil on Wed 23rd May 2012 07:47 UTC in reply to "Comment by Radio"
moondevil Member since:
2005-07-08

Actually C and C++ are the weakest links, not the GPU, as the exploits take advantage of the pointer tricks so dear to C and C++ developers.

If ComputeMaxResults() had been written in a saner language, this exploit wouldn't have been possible without some rewriting in assembly.

Reply Score: 1

RE[2]: Comment by Radio
by kwan_e on Wed 23rd May 2012 08:14 UTC in reply to "RE: Comment by Radio"
kwan_e Member since:
2007-02-18

If ComputeMaxResults() had been written in a saner language, this exploit wouldn't have been possible without some rewriting in assembly.


Did you actually read the functions? It is a calculation logic error. There is no language alive that can prevent logic errors. The logic error results in an invalid buffer access for a GPU-related task. No "sane" language has yet been extended to use GPUs in a way that does not rely on creating buffers directly at some point in its execution.

You do understand that if a managed language were required to access the GPU, it would also need to do manual memory management under the covers, don't you?

Reply Score: 6

RE[3]: Comment by Radio
by moondevil on Wed 23rd May 2012 09:32 UTC in reply to "RE[2]: Comment by Radio"
moondevil Member since:
2005-07-08

Did you actually read the functions? It is a calculation logic error. There is no language alive that can prevent logic errors. The logic error results in an invalid buffer access for a GPU-related task. No "sane" language has yet been extended to use GPUs in a way that does not rely on creating buffers directly at some point in its execution.


Yes, I've read the functions. ComputeMaxResults() and ComputeSize() are the standard way to manipulate blocks of memory/arrays in C, and are related to the way arrays decay into pointers.


You do understand that if a managed language were required to access the GPU, it would also need to do manual memory management under the covers, don't you?


Safe programming languages != GC != Managed.

Ada, Modula-2, Delphi, Turbo Pascal are safe programming languages with manual memory management, compiling nicely to native code as well, just as an example.

Reply Score: 2

RE[4]: Comment by Radio
by kwan_e on Wed 23rd May 2012 09:36 UTC in reply to "RE[3]: Comment by Radio"
kwan_e Member since:
2007-02-18

Ada, Modula-2, Delphi, Turbo Pascal are safe programming languages with manual memory management, compiling nicely to native code as well, just as an example.


And do they prevent you from making the LOGIC ERROR that was explained about the functions?

Reply Score: 2

RE[5]: Comment by Radio
by moondevil on Wed 23rd May 2012 11:09 UTC in reply to "RE[4]: Comment by Radio"
moondevil Member since:
2005-07-08

And do they prevent you from making the LOGIC ERROR that was explained about the functions?


YES! Because the LOGIC ERROR is about a MEMORY ACCESS ALGORITHM known to ANY C PROGRAMMER.

Reply Score: 3

RE[6]: Comment by Radio
by kwan_e on Wed 23rd May 2012 11:17 UTC in reply to "RE[5]: Comment by Radio"
kwan_e Member since:
2007-02-18

YES! Because the LOGIC ERROR is about a MEMORY ACCESS ALGORITHM known to ANY C PROGRAMMER.


Really.

static uint32 ComputeMaxResults(size_t size_of_buffer) { return (size_of_buffer - sizeof(uint32)) / sizeof(T); }

So Delphi, say, would prevent someone from making a mistake in a subtraction followed by a division, knowing what the calculation would be used for, would it?

size_of_buffer is an integer.
So is sizeof(uint32).
So is sizeof(T).

Are you seriously telling me there are programming languages out there that would actually tell the programmer, "hey, did you know the size_of_buffer you passed in was smaller than the size of a uint32? I checked every usage of ComputeMaxResults and noticed these unsafe uses"?

Reply Score: 4

RE[7]: Comment by Radio
by moondevil on Wed 23rd May 2012 13:11 UTC in reply to "RE[6]: Comment by Radio"
moondevil Member since:
2005-07-08

Yes, because if you had really cared to read everything, you would see that the outcome of these functions is used for buffer manipulation tricks.

This calculation then overflowed and made the result of this function zero, instead of a value at least equal to sizeof(uint32). Using this, Pinkie was able to write eight bytes of his choice past the end of his buffer. The buffer in this case is one of the GPU transfer buffers, which are mapped in both processes’ address spaces and used to transfer data between the Native Client and GPU processes. The Windows allocator places the buffers at relatively predictable locations; and the Native Client process can directly control their size as well as certain object allocation ordering. So, this afforded quite a bit of control over exactly where an overwrite would occur in the GPU process.

The next thing Pinkie needed was a target that met two criteria: it had to be positioned within range of his overwrite, and the first eight bytes needed to be something worth changing. For this, he used the GPU buckets, which are another IPC primitive exposed from the GPU process to the Native Client process. The buckets are implemented as a tree structure, with the first eight bytes containing pointers to other nodes in the tree. By overwriting the first eight bytes of a bucket, Pinkie was able to point it to a fake tree structure he created in one of his transfer buffers. Using that fake tree, Pinkie could read and write arbitrary addresses in the GPU process. Combined with some predictable addresses in Windows, this allowed him to build a ROP chain and execute arbitrary code inside the GPU process.


A safer language would raise a runtime error when such a situation is detected.

The logic error, as you call it, only exists because they need to calculate specific values for pointer arithmetic. Without pointer arithmetic, there is no need for logic errors that turn into buffer exploits.

Reply Score: 2

RE[7]: Comment by Radio
by anevilyak on Wed 23rd May 2012 17:53 UTC in reply to "RE[6]: Comment by Radio"
anevilyak Member since:
2005-09-14


Really.

static uint32 ComputeMaxResults(size_t size_of_buffer) { return (size_of_buffer - sizeof(uint32)) / sizeof(T); }

So, say, Delphi, would prevent someone from making a mistake doing a subtraction and then a division, knowing what the calculation would be used for, would it?


I don't know about Delphi specifically, but for most of the languages that are considered "safe" in this respect, you're not allowed to pass around raw blocks of memory without any type safety/control. As a consequence, the language's runtime always knows and tracks the size of any currently allocated arrays. Given this knowledge, the runtime effectively converts any instance of array[x] to something analogous to:

if (x < 0 || x >= arraySize)
    raise_runtime_error();
else
    return array[x];

Obviously this incurs some overhead compared to the raw C version, but it does indeed prevent these kinds of problems. So yes, it really is possible to handle in a safe/high level language, if the logic error in question is confined to the level of the language. However, if the logic error is actually in the instructions passed to the GPU itself, and the latter is able to access arbitrary areas of memory, then that's a different story.

Edited 2012-05-23 17:54 UTC

Reply Score: 2

RE[4]: Comment by Radio
by looncraz on Wed 23rd May 2012 17:02 UTC in reply to "RE[3]: Comment by Radio"
looncraz Member since:
2005-07-24

**ALL** languages talk to the hardware the same way!

Memory pointers are a hardware feature; they just happen to be 'exposed' in C/C++. I can use assembly to fool any program written in any language into believing I am friendly and of the same breed, and inject myself using hardware features.

Security of the kind you are describing would require a massive hardware change where memory is addressed in locked relative-memory segments. That is, every process could create restricted areas of memory accessible only through a compile-time-validated code path - something a few steps beyond the NX bit.

Even then, you would only need to find a way to inject yourself into that code-path by altering the binary... but that would be easier to catch and prevent. You would then need to rely on OS/hardware bugs... but you would still be able to get 'in' in some manner...nothing is safe beyond not permitting execution at all...

Which language was used simply doesn't mean squat. A program written in Delphi still boils down to approximately the same machine code as one written in C. It's still movl %eax, %ecx, call, jmp, etc...

--The loon

Edited 2012-05-23 17:02 UTC

Reply Score: 2

RE[3]: Comment by Radio
by panzi on Wed 23rd May 2012 20:39 UTC in reply to "RE[2]: Comment by Radio"
panzi Member since:
2006-01-22

You say there is no known language where this calculation would return the right result? Obviously you don't know Python or Ruby. These languages have variable-length integers, which means you never get an integer overflow/underflow.

Yes, the result is then a negative number. But given the definition of the function and its parameters, the result is "correct". And in Ruby/Python you don't have any buffers through which you can access arbitrary memory anyway.

Reply Score: 2

RE[4]: Comment by Radio
by kwan_e on Wed 23rd May 2012 23:14 UTC in reply to "RE[3]: Comment by Radio"
kwan_e Member since:
2007-02-18

You say there is no known language where this calculation would return the right result? Obviously you don't know Python or Ruby. These languages have variable-length integers, which means you never get an integer overflow/underflow.


I've programmed in Python. I love Python. How would you suggest Python directly instruct the GPU? I'll give you a hint: you write the extension in C.

This is the cause of my earlier lament. People like you treat it as though languages like Python and Ruby magically spawn out of nowhere without having anything to do with C.

Yes, the result is then a negative number. But given the definition of the function and its parameters, the result is "correct".


But given the PURPOSE of the function, the "correct" answer is wrong. And you'll end up with the same problem of incorrectly addressing the buffer's contents.

So it is a LOGIC error. The formula is WRONG. Languages cannot fix wrong formulae, which is the heart of the problem with the function.

And in Ruby/Python you don't have any buffers through which you can access arbitrary memory anyway.


Unless you write an extension in C, which you pretty much have to do if you want it to talk to the GPU.

This is why one of the earlier commenters was right. This is about the GPU. Not the language.

Reply Score: 3

RE: Comment by Radio
by renox on Wed 23rd May 2012 07:59 UTC in reply to "Comment by Radio"
renox Member since:
2005-07-06

More seriously, the GPU is *again* the weak link. That is cause for concern for the security of modern browsers: is it manageable when they have so much code touching so many hard/soft wares?


And it's only the start: when I read about Firefox's developers working on WebGL, I immediately thought: this feature has a lot of potential security issues.

Reply Score: 2

Not the same, but...
by Savior on Wed 23rd May 2012 08:08 UTC
Savior
Member since:
2006-09-02

Maybe not the same guys, but they obviously got the inspiration from the same place (see http://arstechnica.com/business/2012/03/googles-chrome-browser-on-f... ). Which is all right, because Pinkie Pie is awesome.

Reply Score: 3

Not sure how I feel about exploits
by FunkyELF on Wed 23rd May 2012 19:26 UTC
FunkyELF
Member since:
2006-07-26

If the phones and game consoles I bought came with root access I'd be against exploits. But as it stands, exploits are what allows me to root these things.

Reply Score: 2