romkyns
September 14th, 2011, 10:31
A conventional breakpoint is set by replacing an opcode with "int 3". This is fine in an EXE file, but how exactly does it work in a DLL? If I understand correctly, DLLs map the same memory into several processes' address spaces - this being their main attraction, since the code occupies physical memory only once. But surely an "int 3" written into such a page would then be seen and executed by every process that shares the DLL?
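For context, here's roughly how I understand a debugger plants the breakpoint - a minimal sketch only, with error handling omitted; hProcess and addr are assumed to come from the debugger's own bookkeeping:

```c
#include <windows.h>

BYTE set_int3(HANDLE hProcess, LPVOID addr)
{
    BYTE orig;
    const BYTE cc = 0xCC;   /* 0xCC is the one-byte "int 3" opcode */
    SIZE_T n;

    /* Save the original opcode so it can be restored when the
     * breakpoint is hit or removed. */
    ReadProcessMemory(hProcess, addr, &orig, 1, &n);

    /* WriteProcessMemory takes care of the page protection itself;
     * if the page is a shared image page, this write is presumably
     * where any copy-on-write magic would kick in. */
    WriteProcessMemory(hProcess, addr, &cc, 1, &n);

    /* Don't let the CPU execute a stale cached copy of the code. */
    FlushInstructionCache(hProcess, addr, 1);
    return orig;
}
```

So my question is really about what happens inside that WriteProcessMemory call when the target page is shared.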
One theory I had is that the pages are mapped with copy-on-write semantics, where the OS transparently makes a copy of the modified page before applying the change. Presumably the first write would fault, the OS would make the copy, and then the write would be repeated invisibly.
If this is so, can I somehow tell in OllyDbg whether a particular DLL page is "real" or "copied"? And if this is way off, how do DLL breakpoints *actually* work?
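To make the theory concrete, here's a little self-test I imagine would show it - a sketch only, assuming the Shared bit reported per page by QueryWorkingSetEx distinguishes the shared original from a private copy, and using kernel32's Sleep merely as an arbitrary probe address inside a shared DLL:

```c
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

#pragma comment(lib, "psapi.lib")

/* Returns 1 if the page holding addr is shared, 0 if private,
 * -1 if it isn't resident in the working set. */
static int page_is_shared(PVOID addr)
{
    PSAPI_WORKING_SET_EX_INFORMATION info = { 0 };
    info.VirtualAddress = addr;
    if (!QueryWorkingSetEx(GetCurrentProcess(), &info, sizeof info))
        return -1;
    if (!info.VirtualAttributes.Valid)
        return -1;
    return info.VirtualAttributes.Shared ? 1 : 0;
}

int main(void)
{
    BYTE *p = (BYTE *)GetProcAddress(GetModuleHandleA("kernel32.dll"),
                                     "Sleep");
    BYTE saved = *p;
    volatile BYTE touch = *p;   /* read it so the page is resident */
    (void)touch;
    printf("before write: shared=%d\n", page_is_shared(p));

    /* Patch one byte in place; the page is execute-read, so
     * unprotect it first. Writing the saved byte back keeps the
     * code intact while still triggering the write fault. */
    DWORD old;
    VirtualProtect(p, 1, PAGE_EXECUTE_READWRITE, &old);
    *p = saved;
    VirtualProtect(p, 1, old, &old);

    printf("after write:  shared=%d\n", page_is_shared(p));
    return 0;
}
```

If the Shared bit flipped from 1 to 0 after the write, that would seem to confirm the copy-on-write theory - though I'd still like to know whether OllyDbg exposes this distinction anywhere.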