Albert Saniger, founder and former CEO of AI shopping startup Nate, faces federal fraud charges for allegedly misleading investors about the company’s technology.
The U.S. Department of Justice unsealed an indictment Wednesday, accusing Saniger of falsely claiming that Nate’s app completed checkouts using fully automated artificial intelligence when, prosecutors say, it relied heavily on human labor.
Nate, launched in 2018, raised over $50 million from investors, including a $38 million Series A round led by Renegade Partners in 2021. The app promised a seamless, AI-driven one-click checkout across e-commerce platforms.
Prosecutors allege that the platform instead depended on hundreds of contractors in the Philippines to complete purchases manually, with the app’s actual automation rate described as “effectively 0%.” Saniger allegedly assured investors that transactions ran “without human intervention,” a claim contradicted by a 2022 report in The Information that exposed the company’s internal operations.
Despite hiring data scientists and acquiring AI tools, Nate’s technology failed to match its marketing, according to the DOJ. The company depleted its funds and sold its assets in early 2023, leaving investors with near-total losses. Saniger, now a managing partner at venture firm Buttercore Partners, has not publicly commented on the charges; Buttercore likewise declined to respond.
The case reflects heightened scrutiny of startups accused of overstating AI capabilities to secure funding. Similar controversies, such as a drive-through software startup exposed for using human workers instead of AI, underscore growing concerns about transparency in tech innovation. As global investment in AI surges, regulators and investors increasingly demand clearer distinctions between genuine technological advancement and deceptive marketing.
Saniger’s indictment marks a significant test of accountability in the tech sector, underscoring the risks of prioritizing hype over substance. Legal experts suggest the outcome could influence how AI-driven companies are evaluated, potentially reshaping due diligence practices to deter future misrepresentation. The case aligns with broader efforts to hold emerging-technology firms to ethical standards, balancing innovation against investor and consumer trust.