
Control Hijacking Attacks

Background

Control-hijacking attacks (classic example: buffer overflows) exploit properties of low-level code and system architecture:

  • Many system components are written in C for performance and low-level access. C exposes raw memory addresses and does not enforce bounds checks on pointer arithmetic.
  • Attackers leverage knowledge of calling conventions and architecture details (for example, stack growth direction and stack frame layout) to redirect control flow.

Hardware-assisted bounds checking exists (for example, Intel MPX), but it has not been widely deployed due to complexity and overhead.

Defenses and engineering approaches

  • Avoid bugs by design (safer APIs and language choices).
  • Tooling: static analysis (compiler warnings, formal checks) and fuzzing (randomized inputs that exercise corner cases).
  • Prefer memory-safe languages (e.g., Java, C#, Rust) for new development when practical. For low-level components that must use C, use sanitizers and hardened libraries.

How buffer overflows are exploited

  • Objective: gain control of the instruction pointer (return address, function pointers) to transfer execution to attacker-controlled code or to a different location in existing code.
  • Techniques: code injection (put shellcode in memory) or code-reuse (return-to-libc, ROP). Both require knowing or reliably guessing memory addresses.

Mitigations (common techniques)

  1. Stack canaries
  • Insert a secret value (canary) between local buffers and control data (return address). If a buffer overflow overwrites the canary, the runtime detects corruption before using the corrupted return address.
  • Canaries can be static magic values or randomized per-process. They can be bypassed if their value is leaked or predictable, or if the attacker corrupts other control data without touching the canary.
  2. Bounds checking

Ensure pointer dereferences are within the allocation bounds. Implementation strategies include:

  • Electric Fence: place an unmapped guard page after each allocation so overflows fault immediately. Useful for debugging, but memory-inefficient.
  • Fat pointers: store base/limit with pointers. Accurate but breaks ABI and complicates concurrency.
  • Shadow metadata: keep allocation size metadata in a separate table and consult it at dereference time.

Baggy Bounds (summary)

  • Round allocations up to a power-of-two size and align them accordingly. Store log2(size) in a compact metadata table indexed by slots (e.g., one byte per 16-byte slot).
  • Example metadata usage:

size = 1 << table[p >> log_of_slot_size]
base = p & ~(size - 1)
check: (p' >= base) && ((p' - base) < size)

Optimized check (bitwise trick):

(p ^ p') >> table[p >> log_of_slot_size] == 0

Baggy Bounds can be combined with virtual memory protections to make out-of-bounds dereferences fail fast.

Other notes

  • Malloc/free metadata corruption and use-after-free bugs are additional avenues for control-hijacking; hardened allocators and runtime checks help mitigate these.
  • Best practice: layer multiple mitigations (canaries, ASLR, DEP/NX, safe allocators, sanitizers) to increase attack difficulty.