CURRENT POSITIONING
What we can say cleanly right now
- Work focuses on architectural directions that replace standard attention in custom models.
- Primary emphasis is linear scaling with input length and the resulting compute-speed advantages in relevant settings.
- Capability claims remain tied to specific benchmarks and scoped to the available evidence.
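
To make the linear-scaling claim concrete, here is a minimal illustrative sketch (not the actual architecture, whose details are not given here) of the standard asymptotic comparison: full attention costs roughly O(n^2 * d) in sequence length n and width d, while linear-attention-style replacements cost roughly O(n * d^2). The function names and constants are hypothetical.

```python
# Hypothetical back-of-envelope FLOP estimates, assuming the textbook
# cost model: full attention ~ O(n^2 * d), linear attention ~ O(n * d^2).

def quadratic_attention_flops(n: int, d: int) -> int:
    # QK^T score matrix (n * n * d) plus weighted sum over values (n * n * d).
    return 2 * n * n * d

def linear_attention_flops(n: int, d: int) -> int:
    # Kernelized form: build a d x d running state (n * d * d),
    # then read it out per position (n * d * d).
    return 2 * n * d * d

if __name__ == "__main__":
    d = 64
    for n in (1_024, 8_192, 65_536):
        ratio = quadratic_attention_flops(n, d) / linear_attention_flops(n, d)
        print(f"n={n:>6}: quadratic/linear cost ratio = {ratio:.0f}x")
```

Under this cost model the ratio is simply n/d, which is why the advantage grows with context length while staying benchmark- and setting-dependent in practice.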