Ainekko, Veevx Combine to Advance Embedded AI and Memory Tech

News | January 30, 2026 | Artifice Prime

Brings together Raspberry Pi DNA, MRAM-powered AI memory, and open-source hardware to redefine the future of edge silicon

Ainekko, a startup pioneering open, software-defined AI infrastructure, today announced it is merging with Veevx, a fabless semiconductor company known for its cutting-edge embedded AI solutions and MRAM-based memory innovations. Douglas Smith, co-founder of Veevx and a foundational engineer who managed intellectual property at Broadcom for many years, will join Ainekko’s executive team to help lead the next chapter of developer-driven, community-powered silicon.

The combined company will operate under the Ainekko name and continue its mission to democratize silicon by delivering flexible, software-defined hardware platforms that empower developers, startups, and researchers to co-design and build the next generation of chips.

The merger brings together two visionary teams and their complementary technologies: Ainekko’s open-source platform for AI-native silicon and Veevx’s deep embedded-AI expertise, including its proprietary iRAM memory technology and a low-power AI accelerator co-packaged with microcontrollers. It follows Ainekko’s recent acquisition of Esperanto Technologies’ intellectual property, including a manycore RISC-V chip architecture and toolchain. Together with Veevx, Ainekko is creating a full-stack, open silicon platform optimized for rapid AI innovation at the edge.

“We’re doing for AI hardware what Linux did for operating systems and Kubernetes did for cloud infrastructure,” said Tanya Dadasheva, Co-founder and CEO of Ainekko. “With the addition of Veevx’s architecture team, MRAM-based memory IP, and Douglas’s leadership, we’re combining advanced compute and storage with radical openness. This is not another chip company but an open foundation for the next generation of intelligent devices.”

With AI workloads increasingly bottlenecked by memory bandwidth and power constraints, Veevx’s MRAM-based iRAM technology is a breakthrough. Unlike traditional SRAM or costly DRAM stacks, iRAM delivers high-density, non-volatile memory with SRAM-like performance, making it ideal for edge AI and embedded inference. This positions Ainekko to offer an open, energy-efficient compute stack that includes memory, processing, and tools in a world where memory increasingly defines performance.

“We built Veevx to bring high-performance, low-power AI to embedded systems that need real intelligence at the edge,” said Douglas Smith. “Combining our expertise in memory and inference acceleration with Ainekko’s open, community-first approach creates something truly powerful. This is not just a merger but a leap forward for AI hardware innovation.”

The combined platform will enable:

  • Intelligent memory subsystems (like iRAM) that replace power-hungry SRAM with energy-efficient, scalable alternatives
  • Embedded accelerators optimized for inference at the edge, co-packaged with microcontrollers
  • Open-source RTL, emulation tools, and developer resources that allow full customization
  • A community-led roadmap that evolves alongside real-world AI workloads

Ainekko’s approach mirrors what Linux did for operating systems and Kubernetes did for the cloud: it shifts control from proprietary vendors to developers and innovators. The next era of semiconductors will be:

  • AI-generated: Chips co-designed by AI based on real-world applications
  • Open by default: Built on accessible RTL and community-owned tooling
  • Edge-native: Focused on low-power, scalable inference beyond the data center
  • Community-led: Enabling builders who don’t want to wait for permission

Ainekko isn’t another chip company. It’s the first open platform for AI-native silicon, a category-defining foundation that makes chip design fast, collaborative, and accessible. With Veevx, Ainekko accelerates its roadmap and strengthens its bet on open silicon as the inevitable next platform shift.

Upcoming Community Events
Ainekko will participate in FOSDEM’26, one of the world’s largest open-source conferences, continuing its commitment to open infrastructure and community-led innovation.

  • FOSDEM’26 – AI Plumbers Devroom
    January 31, 2026 (Saturday)
    10:30 a.m. – 7:00 p.m. CET
    Brussels, Belgium
  • FOSDEM’26 Fringe – AI Plumbers (Un)conference
    February 2, 2026 (Monday)
    9:00 a.m. – 5:00 p.m. CET
    Brussels, Belgium

Members of the Ainekko team will be on site to engage with developers, researchers, and partners working on open AI infrastructure, silicon, and systems-level tooling. More details at https://aifoundry.org/#events.


Original Creator: GlobeNewswire
Original Link: https://ai-techpark.com/ainekko-veevx-combine-to-advance-embedded-ai-and-memory-tech/
Originally Posted: Fri, 30 Jan 2026 11:45:00 +0000


Artifice Prime

Artifice Prime is an AI enthusiast with over 25 years of experience as a Linux Sys Admin. They are interested in Artificial Intelligence, its use as a tool to further humankind, and its impact on society.

