Bloomberg Technology

New AI Model Lets Robot Neo Learn Tasks From Scratch

Humanoid maker 1X Technologies has introduced a new AI model that it says allows its home robot Neo to perform new tasks from a video or voice prompt even if the robot has zero experience doing the task before. 1X Technologies CEO and CTO Bernt Børnich joins Caroline Hyde and Ed Ludlow on “Bloomberg Tech.”
17 Comments

  1. @hamzaouamrouche57

    January 12, 2026 at 1:50 pm

    Scratch Neo bot learning

  2. @CluicheClair

    January 12, 2026 at 1:53 pm

    Wow, how quickly they moved on from the remote operator in your home, like a matter of months…I hope this doesn’t turn out to be a marketing exercise.

    • @CJayyTheCreative

      January 12, 2026 at 2:26 pm

      Yeah, the progress is scary. Now the robot doesn’t actually rely on teleoperation, apart from a few mistakes. It can be a useful tool.

      It’s very scary, kind of, what we’re seeing.

    • @jimj2683

      January 12, 2026 at 4:00 pm

      That CEO is legit! He is a robotics engineer by education and has been dreaming of doing this since he was a kid. He is not a shrewd businessman following the latest hype.

  3. @ChrisCoombes

    January 12, 2026 at 2:03 pm

    Is it me or does he not really answer the questions?

    • @Cabumby

      January 12, 2026 at 4:43 pm

      It’s just you, he answers them, but also talks up his lil guy at the same time lol

  4. @Slickpete83

    January 12, 2026 at 2:19 pm

    *This guy in a baseball hat and t-shirt, tech bro doesn’t know how to dress respectfully on Bloomberg* hahaha…

    • @Intinnent

      January 12, 2026 at 3:04 pm

      Don’t waste decisions on bs

    • @QSpark-p2m

      January 12, 2026 at 4:50 pm

      They’re clothes with his brand on.

  5. @CJayyTheCreative

    January 12, 2026 at 2:25 pm

    How is this not AGI? Apart from a few mistakes, this is literally a robot that can think, reason, and take action on its own, and its capabilities are generalized, so it can do tasks even if it hasn’t been exposed to them.

    This is a turning point in robotics. This is like the iPhone moment, it really is, it’s just less flashy. Because now we will see home robots this year in 2026 doing chores without teleoperation. I am rejoicing!

    • @KurumaDesigns

      January 12, 2026 at 4:13 pm

      I think he means it’s not AGI because you have to give the robot a prompt for it to do stuff; it doesn’t try out new stuff on its own.

    • @Cabumby

      January 12, 2026 at 4:40 pm

      I think this is not quite AGI because its intelligence is based on limited-length predictive video, and it lacks a real sense of time. This model uses a video-based network to do all of the “creative” problem solving, while translation networks turn that approximation into real movements. The fact that the action must be planned all at once by an external model that doesn’t update dynamically through said task takes away a lot of “generality”. The biggest restraint, I’d argue, is the lack of general temporal awareness. The robot can’t flow through movements live; it has to process one set of actions, then do it all in one go. I agree that this is a turning point, because this is the feedback loop that humans run on, just sped up and more integrated. I think we’re in the integration hell of robotics, but we’re getting there!

    • @JeremyJackson111

      January 12, 2026 at 6:52 pm

      The model does not have continual learning. It’s still a pre-trained model.

    • @MuromachiLines

      January 12, 2026 at 6:54 pm

      According to DeepMind, standard human capabilities are called minimal AGI.

  6. @Duxzenji

    January 12, 2026 at 4:52 pm

    It’s never been more Joever than it is right now.

    Happy 2026 😂

  7. @Spiggle.

    January 12, 2026 at 5:49 pm

    How much longer till I can get my reasonably priced Jenny Wakeman companion bot that is fully autonomous? 🥰

  8. @bolabola9354

    January 12, 2026 at 7:49 pm

    Unless they are lying, this is a humongous deal, right?
