Today we’re excited to announce the launch of the first core service in our platform: the AI Schema Learning Framework (ASLF).
🎯 What it does:
ASLF ingests raw transaction and network data, normalizes disparate schemas, and generates canonical format outputs that our downstream systems (like RLIE — the Reinforcement Learning Inference Engine) can use for decision-making.
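To make the canonical-format idea concrete, here's a rough sketch of what mapping a source-specific record onto a shared shape could look like. The field names and the `CanonicalTx`/`normalize_tx` helpers are illustrative assumptions, not the actual ASLF schema:

```python
# Illustrative sketch only: CanonicalTx and normalize_tx are assumed names,
# not the real ASLF schema or API.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any, Dict


@dataclass
class CanonicalTx:
    """One possible canonical transaction shape for downstream consumers like RLIE."""
    source: str          # e.g. "btc", "eth", "binance"
    tx_id: str
    asset: str
    amount: float
    timestamp: datetime  # normalized to UTC


def normalize_tx(source: str, raw: Dict[str, Any]) -> CanonicalTx:
    """Map a raw, source-specific record onto the canonical shape."""
    return CanonicalTx(
        source=source,
        tx_id=str(raw.get("txid") or raw.get("hash") or raw["id"]),
        asset=str(raw.get("asset") or raw.get("symbol") or "UNKNOWN"),
        amount=float(raw.get("amount") or raw.get("value") or 0.0),
        timestamp=datetime.fromtimestamp(int(raw["ts"]), tz=timezone.utc),
    )
```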
🧱 Initial features in place:
• Service skeleton created — endpoints deployed and running locally
• Ingestion and a simple store implemented, with raw data landing in our dev store
• Normalize-only endpoint added, enabling conversion of input data to canonical format (see the request sketch after this list)
• Documentation and sample schema fixtures published — dev team ready for onboarding
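For anyone who wants to kick the tires locally, a call to the normalize-only endpoint might look roughly like the following. The port, path, and payload shape are assumptions for illustration, not the published contract; check the docs and fixtures above for the real thing:

```python
# Hypothetical request to the normalize-only endpoint; the URL and payload
# shape are assumed for illustration and may not match the published API.
import requests

raw_record = {
    "txid": "abc123",
    "symbol": "BTC",
    "value": "0.42",
    "ts": 1718000000,
}

resp = requests.post(
    "http://localhost:8000/normalize",            # assumed local dev address
    json={"source": "btc", "record": raw_record},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # canonical-format output ready for downstream systems like RLIE
```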
✅ Why this matters:
• It lays the foundation for multi-chain interoperability by unifying schema context
• The speed and accuracy of our downstream analytics depend on this engine
• Having ASLF up and running moves us a big step closer to our MVP launch
🚧 Next steps:
• Schema persistence & versioning (transition from the current in-memory store to a persistent DB; see the sketch after this list)
• Feedback loop logging and integration with RLIE
• Expand ingestion pipelines to support BTC, ETH, CEX, and DEX data feeds
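On the persistence & versioning item, one possible direction is a small versioned schema store. The sketch below uses SQLite purely as a stand-in for whichever persistent DB we land on; the table layout and function names are illustrative only, not the planned design:

```python
# Rough sketch of versioned schema persistence; SQLite, the table layout,
# and the function names are placeholders, not the planned design.
import json
import sqlite3
from typing import Any, Dict


def init_store(path: str = "schemas.db") -> sqlite3.Connection:
    """Create (if needed) and open a tiny versioned schema store."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS schema_versions (
               name    TEXT    NOT NULL,
               version INTEGER NOT NULL,
               body    TEXT    NOT NULL,
               PRIMARY KEY (name, version)
           )"""
    )
    return conn


def save_schema(conn: sqlite3.Connection, name: str, body: Dict[str, Any]) -> int:
    """Persist a new version of a schema and return its version number."""
    (current,) = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM schema_versions WHERE name = ?",
        (name,),
    ).fetchone()
    version = current + 1
    conn.execute(
        "INSERT INTO schema_versions (name, version, body) VALUES (?, ?, ?)",
        (name, version, json.dumps(body)),
    )
    conn.commit()
    return version
```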
Stay tuned — we’re building fast, and the next phase of the platform is around the corner.
