Columns:
| # | Tweet | User | Followers | Views ▼ | Views/Followers | Engagement | Posted |
|---|---|---|---|---|---|---|---|
| 1 | [video] Dream2Flow from Stanford University marks a major breakthrough:<br>Simply put, humanoid robots can learn physical movements from videos generated via text/images, and subsequently execute tasks successfully in real-world environments.<br>This significantly reduces the effort required | @CyberRobooo ✓ | 36.4K | 5.4K | 0.1x | 53 | Mar 22 |
| 2 | [video] Just discovered SimToolReal: an insane breakthrough in robot tool manipulation.<br>No more task-specific training; it uses sim-to-real RL to handle unseen tools zero-shot. Hammers, scissors, you name it, with over 120 real-world demos crushing it.<br>more:<br>Mind-blown | @CyberRobooo ✓ | 35.4K | 5.1K | 0.1x | 78 | Feb 25 |
| 3 | [video] Humanoid robots often look capable, but break down on contact.<br>This project 👉 adds a key piece: Touch Dreaming, predicting future touch during action.<br>Instead of see → act, it becomes see + touch → predict contact → act.<br>That shift shows up in tasks | @CyberRobooo ✓ | 36.9K | 4.4K | 0.1x | 79 | Apr 20 |