"""
Genesis Persistent Context Architecture — JIT Hydration Master Function
Story 2.02 — Track B

interceptor_jit_hydration() — Master JIT Hydration Function.
Assembles <ZERO_AMNESIA_STATE> envelope from all memory layers concurrently.
Every LLM call starts with full context in <50ms.

Pipeline:
    1. fast_extract(task_payload)        -> (target_entities, intent_string)
    2. scatter_gather_memory(...)        -> MemoryContext  [3 layers, 45ms timeout]
    3. build_envelope(memory_context)   -> XML string
    4. Attach as task_payload["system_injection"]
    N)Optional   )fast_extract)scatter_gather_memory)build_envelopeMemoryContexttask_payloadreturnc                    K   t        |       \  }}| j                  dd      }t        |||d       d{   }t        |      }|| d<   | S 7 w)u  
    Master JIT hydration function.

    Orchestrates the full memory assembly pipeline for a single LLM call:

    Step 1 — fast_extract() (synchronous, pure Python, <1ms)
        Parses the task payload and extracts entity names and an intent string.
        Never makes I/O calls. Safe to run in the hot path.
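    A minimal sketch of what such a pure-Python extractor could look like.
    ``demo_fast_extract`` and its regex heuristic are hypothetical stand-ins,
    not the real ``fast_extract``; only the text-field names come from the
    Args section below:

```python
import re

# Text fields scanned for entities, per the Args documentation below.
_TEXT_FIELDS = ("prompt", "description", "task", "file", "code", "context")

def demo_fast_extract(task_payload: dict) -> tuple[list[str], str]:
    # Pure string scanning over the payload's text fields: no I/O, hot-path safe.
    text = " ".join(str(task_payload.get(f, "")) for f in _TEXT_FIELDS)
    # Hypothetical heuristic: treat snake_case tokens as candidate entity names.
    entities = sorted(set(re.findall(r"\b[A-Za-z_][A-Za-z0-9_]*_[A-Za-z0-9_]+\b", text)))
    # Crude one-line intent: first sentence, capped at 80 chars.
    intent = text.strip().split(".")[0][:80]
    return entities, intent
```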

    Step 2 — scatter_gather_memory() (async, 45ms wall-clock budget)
        Fires L1 (Redis), L2 (KG JSONL), and L3 (Qdrant) fetches concurrently.
        Each layer has its own timeout. A slow or offline layer returns None
        without blocking the others. MemoryContext always returns with whatever
        data arrived in time — graceful degradation is built into scatter_gather.
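    The per-layer timeout pattern described above can be sketched as follows.
    The layer coroutines and the ``demo_scatter_gather`` wrapper are
    illustrative stand-ins; the real implementation lives in
    ``scatter_gather.py``:

```python
import asyncio
from typing import Optional

async def _fetch_with_timeout(coro, timeout_s: float) -> Optional[str]:
    # A slow or offline layer degrades to None instead of raising.
    try:
        return await asyncio.wait_for(coro, timeout=timeout_s)
    except (asyncio.TimeoutError, OSError):
        return None

async def _l1(task_id: str) -> str:
    # Stand-in for a fast Redis fetch: returns immediately.
    return f"l1:{task_id}"

async def _l2(task_id: str) -> str:
    # Stand-in for an offline layer: never completes within budget.
    await asyncio.sleep(1)
    return f"l2:{task_id}"

async def demo_scatter_gather(task_id: str) -> dict:
    # Both fetches fire concurrently; the slow one times out without
    # blocking the fast one.
    l1, l2 = await asyncio.gather(
        _fetch_with_timeout(_l1(task_id), 0.045),
        _fetch_with_timeout(_l2(task_id), 0.045),
    )
    return {"l1": l1, "l2": l2}
```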

    Step 3 — build_envelope() (synchronous, pure Python, <1ms)
        Assembles the MemoryContext into a well-formed <ZERO_AMNESIA_STATE> XML
        string. None fields are replaced with canonical fallback strings so the
        envelope is always non-empty and valid.
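    The fallback substitution can be sketched as below. Tag names and
    fallback strings are hypothetical; ``build_envelope`` in
    ``zero_amnesia_envelope.py`` is the authoritative implementation:

```python
from typing import Optional

# Hypothetical canonical fallbacks for layers that returned None.
FALLBACKS = {
    "l1_state": "NO_ACTIVE_TASK_STATE",
    "l2_knowledge": "NO_KNOWLEDGE_GRAPH_MATCHES",
    "l3_history": "NO_SEMANTIC_HISTORY",
}

def demo_build_envelope(l1: Optional[str], l2: Optional[str], l3: Optional[str]) -> str:
    # Replace missing layers with fallbacks so the envelope is never empty.
    parts = {
        "l1_state": l1 if l1 is not None else FALLBACKS["l1_state"],
        "l2_knowledge": l2 if l2 is not None else FALLBACKS["l2_knowledge"],
        "l3_history": l3 if l3 is not None else FALLBACKS["l3_history"],
    }
    inner = "".join(f"<{k}>{v}</{k}>" for k, v in parts.items())
    return f"<ZERO_AMNESIA_STATE>{inner}</ZERO_AMNESIA_STATE>"
```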

    Step 4 — Attach envelope to task_payload["system_injection"]
        The original payload is mutated in-place and returned. All pre-existing
        keys are preserved.

    Args:
        task_payload: Dict representing the LLM task. May contain any of:
            - "task_id"      : str  — used as Redis L1 lookup key
            - "prompt"       : str  — used for entity extraction + intent
            - "description"  : str  — fallback text source for entity extraction
            - "task"         : str  — fallback text source for entity extraction
            - "file"         : str  — text source for entity extraction
            - "code"         : str  — text source for entity extraction
            - "context"      : str  — text source for entity extraction

    Returns:
        The same task_payload dict with one additional key:
            task_payload["system_injection"] — str, the XML envelope
    task_idunknown-   )r   target_entitiesintent_string
timeout_msNsystem_injection)r   getr   r   )r	   r   r   r   memory_contextenvelopes         2/mnt/e/genesis-system/core/memory/jit_hydration.pyinterceptor_jit_hydrationr      sk     P &2,%?"O] y)4G*?'#	+ %N #>2H (0L#$%s   3AAA)__doc__asynciotypingr   r   scatter_gatherr   zero_amnesia_enveloper   r   dictr        r   <module>r       s.      & 1 @9$ 94 9r   