pub struct MetaSlot {
pub free_link: FreeListLink,
pub vtable: AtomicU64,
pub flags: AtomicU32,
pub order: AtomicU8,
pub refcount: AtomicU32,
pub generation: AtomicU32,
pub guard: AtomicU32,
pub meta_aux: AtomicU32,
pub _reserved_tail: [u8; 16],
/* private fields */
}
64-byte cache-line metadata for one physical frame (issue #38).
Layout: free-list links + flags + refcount + optional vtable + generation + reserved tail for future guard bits / generational references without touching the page payload.
Use plain #[repr(C)] (not align(64) on the struct): align(64) would pad the type
size to a multiple of 64 and can inflate size_of to 128. The metadata array is
still allocated with FRAME_META_ALIGN so each slot stays cache-line aligned.
Field order matters: vtable immediately follows free_link so AtomicU64 stays
8-byte aligned without hidden padding after refcount (which would inflate the struct
to 72 bytes).
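The layout claims above can be checked with a standalone sketch. `FreeListLink` is stood in by a two-link struct and the names are illustrative, not the real kernel types; the offsets shown are what `#[repr(C)]` produces for this field order, under the assumption that `FreeListLink` is 16 bytes:

```rust
use std::mem::{offset_of, size_of};
use std::sync::atomic::{AtomicU32, AtomicU64, AtomicU8};

// Stand-in for FreeListLink: two 8-byte links (an assumption).
#[repr(C)]
struct FreeListLink {
    prev: AtomicU64,
    next: AtomicU64,
}

// Plain #[repr(C)]: vtable sits right after free_link, so the AtomicU64
// is naturally 8-byte aligned and the slot lands on exactly 64 bytes.
#[repr(C)]
struct Slot {
    free_link: FreeListLink,  // offset 0, 16 bytes
    vtable: AtomicU64,        // offset 16, 8-byte aligned with no padding
    flags: AtomicU32,         // offset 24
    order: AtomicU8,          // offset 28 (+3 padding bytes)
    refcount: AtomicU32,      // offset 32
    generation: AtomicU32,    // offset 36
    guard: AtomicU32,         // offset 40
    meta_aux: AtomicU32,      // offset 44
    _reserved_tail: [u8; 16], // offset 48..64
}

// align(64) rounds size_of up to the next multiple of 64: a 12-byte
// payload becomes a 64-byte type, and anything past 64 bytes jumps to 128.
#[repr(C, align(64))]
struct OverAligned {
    a: AtomicU64,
    b: AtomicU32,
}

fn main() {
    assert_eq!(size_of::<Slot>(), 64);
    assert_eq!(offset_of!(Slot, vtable), 16);
    assert_eq!(offset_of!(Slot, refcount), 32);
    assert_eq!(size_of::<OverAligned>(), 64); // 12 payload bytes padded to 64
}
```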
Fields§

free_link: FreeListLink

vtable: AtomicU64
*const FrameMetaVtable as bits; 0 means DEFAULT_FRAME_META_VTABLE.

flags: AtomicU32

order: AtomicU8

refcount: AtomicU32

generation: AtomicU32
Bumps each time the frame is successfully claimed from the buddy free list
(see MetaSlot::note_new_allocation_epoch).

guard: AtomicU32
Kernel-owned guard bits (meta_guard); independent of frame_flags.

meta_aux: AtomicU32
Low 16 bits: owner CPU id hint (issue #38); upper bits reserved / NUMA placeholder.

_reserved_tail: [u8; 16]

Implementations§
impl MetaSlot
pub const REFCOUNT_BYTE_OFFSET: usize = META_SLOT_REFCOUNT_BYTE_OFFSET
Byte offset of refcount from the start of MetaSlot (same as META_SLOT_REFCOUNT_BYTE_OFFSET).
pub fn reset_with_free_list_meta(&self)
Reset vtable/guard when returning a frame to the buddy free list (buddy::set_block_meta).
Preserves meta_guard::POISONED so poisoned frames are not silently "healed" on free.
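The preserve-poison rule can be sketched in isolation. The bit value of `meta_guard::POISONED` and the helper name are illustrative assumptions; only the masking rule comes from the documentation above:

```rust
use std::sync::atomic::{AtomicU32, Ordering};

// Assumed guard-bit layout; the real POISONED value may differ.
mod meta_guard {
    pub const POISONED: u32 = 1 << 0;
}

// Sketch of the documented reset rule: clear every guard bit except
// POISONED when a frame goes back to the buddy free list.
fn reset_guard_for_free_list(guard: &AtomicU32) {
    let old = guard.load(Ordering::Relaxed);
    guard.store(old & meta_guard::POISONED, Ordering::Relaxed);
}

fn main() {
    let guard = AtomicU32::new(meta_guard::POISONED | 0b1100);
    reset_guard_for_free_list(&guard);
    // Only the poison bit survives the reset.
    assert_eq!(guard.load(Ordering::Relaxed), meta_guard::POISONED);
}
```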
pub fn meta_aux_load(&self) -> u32
pub fn meta_aux_store(&self, v: u32)
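The meta_aux encoding (low 16 bits as an owner-CPU hint, upper 16 bits reserved) can be sketched with two small helpers. These helper names are illustrative, not part of the real API:

```rust
// Replace the low 16 bits (CPU hint) while leaving the reserved
// upper half untouched.
fn aux_with_cpu_hint(aux: u32, cpu: u16) -> u32 {
    (aux & 0xFFFF_0000) | cpu as u32
}

// Extract the owner-CPU hint from the low 16 bits.
fn aux_cpu_hint(aux: u32) -> u16 {
    (aux & 0xFFFF) as u16
}

fn main() {
    let aux = aux_with_cpu_hint(0xABCD_0000, 3);
    assert_eq!(aux_cpu_hint(aux), 3);
    assert_eq!(aux & 0xFFFF_0000, 0xABCD_0000); // reserved bits untouched
}
```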
pub fn note_new_allocation_epoch(&self)
After a successful CAS(REFCOUNT_UNUSED → 1) in FrameAllocOptions::allocate,
start a new metadata epoch: default vtable, clear guards, bump generation.
The generation bump uses Ordering::Release so another CPU that later
Acquire-loads Self::generation or pairs with the refcount hand-off sees this
epoch for generational use-after-free checks. Ordering::Relaxed would suffice only
if all such checks ran on the allocating CPU, with no cross-CPU visibility requirement.
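The Release/Acquire pairing can be demonstrated standalone. In this sketch (names and values are illustrative), the allocating side clears per-epoch state with Relaxed stores and then publishes with a Release bump; a reader that Acquire-loads the generation and observes the new epoch is guaranteed to also see the cleared state:

```rust
use std::sync::atomic::{AtomicU32, Ordering};
use std::sync::Arc;
use std::thread;

fn main() {
    let generation = Arc::new(AtomicU32::new(0));
    let guard = Arc::new(AtomicU32::new(0xDEAD_BEEF)); // stale epoch state

    let (gen2, guard2) = (Arc::clone(&generation), Arc::clone(&guard));
    let allocator = thread::spawn(move || {
        guard2.store(0, Ordering::Relaxed);   // clear guards for the new epoch
        gen2.fetch_add(1, Ordering::Release); // publish the epoch
    });
    allocator.join().unwrap();

    if generation.load(Ordering::Acquire) == 1 {
        // The Acquire load synchronizes with the Release bump above, so
        // the cleared guard word is visible here.
        assert_eq!(guard.load(Ordering::Relaxed), 0);
    }
}
```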
pub fn get_guard(&self) -> u32
pub fn set_guard(&self, bits: u32)
pub fn fetch_or_guard(&self, bits: u32) -> u32
pub fn is_guard_poisoned(&self) -> bool
Returns true if meta_guard::POISONED is set.
pub fn mark_poisoned(&self)
Marks both meta_guard::POISONED and frame_flags::POISONED (corruption / audit path).
pub fn debug_snapshot(&self) -> (u32, u32, u64)
(generation, guard_bits, vtable_bits) for serial / shell diagnostics.
pub fn vtable_bits(&self) -> u64
Raw vtable pointer bits (0 = default).
pub fn set_vtable_bits(&self, bits: u64)
Install a custom vtable pointer (must point to a 'static FrameMetaVtable).
pub fn vtable_ref(&self) -> &'static FrameMetaVtable
Resolved vtable reference (0 bits map to DEFAULT_FRAME_META_VTABLE).
Misaligned or otherwise invalid non-zero pointer bits fall back to the default vtable
(same as Self::try_vtable_ref returning None).
pub fn try_vtable_ref(&self) -> Option<&'static FrameMetaVtable>
Like Self::vtable_ref, but returns None if non-zero vtable bits are not aligned
to a FrameMetaVtable pointer (8-byte aligned).
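A minimal sketch of the documented resolution rule, working on raw bits instead of real pointers: 0 means "use the default vtable", aligned non-zero bits pass through, and misaligned bits are rejected (try_vtable_ref → None; vtable_ref falls back to the default). The constant values here are illustrative assumptions:

```rust
const DEFAULT_VTABLE_BITS: u64 = 0x1000; // assumed default-vtable address
const VTABLE_ALIGN: u64 = 8;             // align_of::<FrameMetaVtable>() assumed

// Mirrors try_vtable_ref: None for misaligned non-zero bits.
fn try_resolve(bits: u64) -> Option<u64> {
    match bits {
        0 => Some(DEFAULT_VTABLE_BITS),
        b if b % VTABLE_ALIGN == 0 => Some(b),
        _ => None, // misaligned: cannot be a valid vtable pointer
    }
}

// Mirrors vtable_ref: invalid bits fall back to the default.
fn resolve(bits: u64) -> u64 {
    try_resolve(bits).unwrap_or(DEFAULT_VTABLE_BITS)
}

fn main() {
    assert_eq!(try_resolve(0), Some(DEFAULT_VTABLE_BITS));
    assert_eq!(try_resolve(0x2000), Some(0x2000));
    assert_eq!(try_resolve(0x2003), None);
    assert_eq!(resolve(0x2003), DEFAULT_VTABLE_BITS);
}
```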
pub fn generation(&self) -> u32
Loads the allocation generation with Ordering::Acquire, pairing with the
Ordering::Release bump in Self::note_new_allocation_epoch for cross-CPU checks.
pub fn set_generation(&self, g: u32)
Overwrites the generation counter: only for boot-time init or tests.
Normal allocations bump generation via MetaSlot::note_new_allocation_epoch.
Arbitrary values break generational use-after-free checks.
pub fn next(&self) -> u64
pub fn set_next(&self, next: u64)
pub fn prev(&self) -> u64
pub fn set_prev(&self, prev: u64)
pub fn inc_ref(&self)
pub fn dec_ref(&self) -> u32
pub fn get_refcount(&self) -> u32
pub fn set_flags(&self, flags: u32)
pub fn get_flags(&self) -> u32
pub fn get_order(&self) -> u8
pub fn set_order(&self, order: u8)
pub fn set_refcount(&self, count: u32)
pub fn cas_refcount(&self, expect: u32, new: u32) -> Result<u32, u32>
pub fn reset_refcount(&self)
pub fn is_cow(&self) -> bool
pub fn is_dll(&self) -> bool
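The CAS(REFCOUNT_UNUSED → 1) claim pattern referenced under note_new_allocation_epoch can be sketched with a plain AtomicU32. The sentinel value and helper name are assumptions; the point is that exactly one claimant's compare_exchange succeeds:

```rust
use std::sync::atomic::{AtomicU32, Ordering};

// Assumed "frame is unused" sentinel; the real value may differ.
const REFCOUNT_UNUSED: u32 = u32::MAX;

// Allocation succeeds only for the caller whose CAS moves the
// refcount from the unused sentinel to 1.
fn try_claim(refcount: &AtomicU32) -> bool {
    refcount
        .compare_exchange(REFCOUNT_UNUSED, 1, Ordering::Acquire, Ordering::Relaxed)
        .is_ok()
}

fn main() {
    let refcount = AtomicU32::new(REFCOUNT_UNUSED);
    assert!(try_claim(&refcount));  // first claim wins
    assert!(!try_claim(&refcount)); // frame already owned
    assert_eq!(refcount.load(Ordering::Relaxed), 1);
}
```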
Auto Trait Implementations§
impl !Freeze for MetaSlot
impl RefUnwindSafe for MetaSlot
impl Send for MetaSlot
impl Sync for MetaSlot
impl Unpin for MetaSlot
impl UnsafeUnpin for MetaSlot
impl UnwindSafe for MetaSlot