Thursday, 13 October 2016
Bypassing OS X's Code Signature Check
I was trying to patch Mavericks' OSInstall framework to allow installation on MBR partition schemes, but all of my attempts failed. I noticed that the installer's error log reported that it failed to verify the code signature of OSInstall. This never happened to me before with Mountain Lion, so I decided to look into some of the changes in Mavericks. A post by oldnapalm on InsanelyMac's forums confirmed that the installer won't launch OSInstall unless it has been re-signed to reflect the modifications.
Unfortunately, I needed a way to sign this file from Linux, and there is no Linux alternative to the Mac's codesign program. I decided to compare signed and unsigned versions of the same program to see exactly what codesign was changing. I'll try to make this more interesting by adding in some pictures. Here are two copies of the same program, both currently unsigned:

So I signed one of them and verified its signature:


So there are about 10 locations that differ. Some of these locations contained several bytes, so about 40 bytes differed between the two files in total. In addition, the signed version was extended by 9.2 kilobytes of 0x00 characters. I wanted to see what would happen if I changed an arbitrary byte of the signed program and tried to launch it. As expected, the file failed verification.
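For anyone who wants to reproduce the comparison without a hex editor, the byte-level diff can be scripted. A minimal sketch (diff_buffers is my own helper name, not an existing tool):

```c
#include <stddef.h>
#include <stdio.h>

/* Print every offset where two equal-length buffers differ --
   a programmatic version of comparing the two files in a hex editor. */
static size_t diff_buffers(const unsigned char *a, const unsigned char *b,
                           size_t len) {
    size_t count = 0;
    for (size_t i = 0; i < len; i++) {
        if (a[i] != b[i]) {
            printf("offset 0x%zx: 0x%02x -> 0x%02x\n", i, a[i], b[i]);
            count++;
        }
    }
    return count;  /* number of differing bytes */
}
```

Reading both files into memory and calling this on the two buffers reports each differing offset along with the old and new byte values.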

So from there I tried changing each altered byte of the signed program back to its value in the unsigned version. Very quickly I stumbled upon the byte that would make the OS think the file wasn't signed.

However, running the file with the 21st byte changed from 0x38 to 0x28 resulted in a malformed Mach-O format error when I tried to run the program. Let's examine these hex values a little more closely.
0x28 = 00101000
0x38 = 00111000
The 5th bit is the only thing that changed between these two values, so this bit must correspond to the flag that tells the OS that this application is signed. So I tried changing a few more bytes, and managed to find the other byte that needed to be changed.
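XOR makes this kind of single-bit comparison explicit: 0x38 ^ 0x28 = 0x10, which is exactly the 5th bit. A minimal sketch (changed_bit is a hypothetical helper of mine, not part of any Apple tooling):

```c
/* Return the 1-based position of the single bit that differs between
   two bytes, or 0 if they are equal or differ in more than one bit. */
static int changed_bit(unsigned char a, unsigned char b) {
    unsigned char diff = a ^ b;          /* set bits = differing bits */
    if (diff == 0 || (diff & (diff - 1)) != 0)
        return 0;                        /* zero or multiple differing bits */
    int pos = 1;
    while (!(diff & 1)) {                /* shift until the set bit is found */
        diff >>= 1;
        pos++;
    }
    return pos;
}
```

For example, changed_bit(0x38, 0x28) returns 5, matching the analysis above.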

So finally, after changing the 21st byte from 0x38 to 0x28 and the 17th byte from 0x12 to 0x11, I was able to run a modded program that was signed. Let's look at those last hex values:
0x11 = 00010001
0x12 = 00010010
So the bit that is set in 0x12 but not in 0x11 is the 2nd bit. This must also be related to the signature check. After discovering this, I tried applying those same changes to Mavericks' OSInstall, which had previously failed to launch after being modified. Sure enough, it launched without any problems, and I was able to install Mavericks on my MBR disk. Here's the final pseudocode for how to patch modded, signed applications so that they can run:
// Clear code signature bits
Byte 17 &= 0xFD
Byte 21 &= 0xEF
This works because the OS checks whether these bits are set in order to determine if the application is signed. So there's no need for it to verify the application before launching it, because the OS doesn't think it's signed.

Update: I was working with OSInstall again today, March 5th, 2014, and I was able to get the OS to run a modded version of it after changing only byte 17. So changing byte 21 might not be necessary.
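In code, the patch above amounts to two masked writes. A minimal sketch, assuming the whole file has been read into buf and using the post's 1-based byte numbering (byte 17 is buf[16]); note that the 0xFD mask turns 0x12 into 0x10 rather than 0x11, since it clears only the 2nd bit:

```c
#include <stddef.h>

/* Clear the two "code signature" bits, per the pseudocode above.
   Byte numbers in the post are 1-based, so byte 17 is buf[16]
   and byte 21 is buf[20]. */
static void clear_signature_bits(unsigned char *buf, size_t len) {
    if (len < 21)
        return;          /* too short to contain byte 21 */
    buf[16] &= 0xFD;     /* byte 17: clear the 2nd bit (value 0x02) */
    buf[20] &= 0xEF;     /* byte 21: clear the 5th bit (value 0x10) */
}
```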
Update: I ended up having some problems developing an MBR patch for OS X 10.10 Yosemite because of the code signature protection, so I ran a bunch of tests to find a pattern in how bytes 17 and 21 changed when a file was signed. Things are working much better now :) Here's the final code. Note that value is the offset of the start of the current architecture, so it changes for 32-bit/64-bit/Compatible binaries.
There's probably some binary arithmetic that could simplify this algorithm even further, but I'm not very interested in working one out right now.

if (buffer[value + 0x10] == 0x24) {
    buffer[value + 0x10] -= 0x01;
    if (buffer[value + 0x14] >= 0x10)
        buffer[value + 0x14] -= 0x10;
    else
        buffer[value + 0x14] += 0xF0;
}
else if (buffer[value + 0x10] > 0x24) {
    buffer[value + 0x10] -= 0x02;
    if (buffer[value + 0x14] >= 0x20)
        buffer[value + 0x14] -= 0x20;
    else
        buffer[value + 0x14] += 0xE0;
}
Update: Someone who did more research than I did left a good comment that explains the flaws in my process. Based on his comment, I guess we can conclude that bytes 17 and 21 are not being used as bit-field flags to enable the code signature check; however, modifying them like I've done will still circumvent the security.
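For what it's worth, here is my reading of why the patch works (an interpretation, not something stated in the post): bytes 17 and 21 sit at offsets 0x10 and 0x14 of the Mach-O header, which hold the ncmds and sizeofcmds fields. They are counts, not flag bits. Decrementing ncmds by one and reducing sizeofcmds by 0x10 (the size of an LC_CODE_SIGNATURE load command) would simply hide the signature load command from the loader, which matches the byte patterns above:

```c
#include <stddef.h>
#include <stdint.h>

/* Layout of the 32-bit Mach-O header (mirrors struct mach_header in
   <mach-o/loader.h>; the 64-bit variant adds a trailing reserved field
   but keeps these offsets). */
struct mach_header_sketch {
    uint32_t magic;        /* offset 0x00 */
    uint32_t cputype;      /* offset 0x04 */
    uint32_t cpusubtype;   /* offset 0x08 */
    uint32_t filetype;     /* offset 0x0C */
    uint32_t ncmds;        /* offset 0x10: number of load commands */
    uint32_t sizeofcmds;   /* offset 0x14: total size of load commands */
    uint32_t flags;        /* offset 0x18 */
};
```

Byte 17 (1-based) is the low byte of ncmds and byte 21 is the low byte of sizeofcmds, which would explain both the "malformed Mach-O" error when only one of them is changed and why the final algorithm adjusts them together.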