Can this robot think like a human? That was the question that puzzled me all weekend. Figure AI's new demo is truly mind-blowing. I was genuinely surprised by the newly acquired skills of the Figure 01 robot, and it made me think that we are not far from robots that think like humans. I'll comment on the strangest moments in the demo, then look at the technical explanations from Figure AI's executives, and finally consider what all of this means for us as humans.
In the demo, Figure 01 breezes through several things that are very easy for humans and very difficult for robots. The first is its answer to the question "What do you see on the table?" It sees everything, understands it, interprets it, and answers in a tidy sentence; it even catches details, noting that the person's hand is resting on the table. Then it realizes that the person who says he is hungry wants something to eat from it. It looks at the table, figures out that the edible thing there is an apple, and hands it over. The human then asks why it gave him the apple, and while asking he adds an extra task meant to throw the robot off: he asks it to collect the trash from the table. The robot gives a perfectly logical answer, saying the apple was the only edible thing on the table, so that is what it offered, and meanwhile it picks up the trash.
Then comes a planning question: what should be done with the dishes on the table? It says they need to go into the dish rack, and it does so successfully, even turning the glass upside down before placing it in the rack, just like a human would. At the end of all this, when asked how it found its own performance, it evaluates itself: the food is with the right person, the dishes are in the right place, I think I did everything right. So it can self-assess and judge whether its actions led to the desired result. All of this is truly incredible. The robot can see, hear, converse through OpenAI's model, form sentences, make interpretations, and use its hands with remarkable precision. We call this dexterity. It turns the glass the right way, it hands over the apple the right way; its hands work just as well as its intelligence.
I have already mentioned Figure AI, the company that makes this robot. Figure AI recently raised a $675 million investment at a $2.6 billion valuation, with backers including Nvidia, OpenAI, and Jeff Bezos, so it is a robotics company that looks set to grow very quickly. What do the company's own people say about Figure 01's new skills? There are interesting details there too. Corey Lynch of Figure AI's AI team says they have built a robot that can interpret and describe what it sees around it, make plans, hold memories, reason over those memories, and verbally convey its train of thought to the other party. In other words, it can practically think. There is an important emphasis here.
Everything you saw in the demo is something the robot learned itself; nothing is being fed to it in the moment, and there is no teleoperation or task-specific scripting. Everything runs in real time. The robot sees its surroundings through cameras and listens through microphones, and it works with OpenAI's model to interpret what it perceives and to formulate answers. On their Twitter page they explain a little about how the setup works: the model shared with OpenAI handles the high-level reasoning, while Figure's own neural networks provide the learned behaviors, so the robot can pick up skills on its own. They also stress that they build all of the robot's hardware themselves, so the physical hardware and the AI software work in tight cooperation.
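As a rough mental model of the pipeline described here, consider the following Python sketch. It is not Figure AI's code or API: the function names, the policy library, and the stubbed model call are all assumptions made purely for illustration.

```python
# A minimal sketch of the described setup, not Figure AI's actual code.
# `call_vlm`, the policy names, and the printouts are all illustrative stand-ins.

from dataclasses import dataclass
from typing import Callable

@dataclass
class PlannedStep:
    reply: str      # what the robot will say back
    behavior: str   # which learned skill to run next

def call_vlm(images: list, conversation: list) -> PlannedStep:
    """Stand-in for the multimodal model: given camera frames and the spoken
    conversation so far, return a reply and the name of a learned behavior.
    A real system would query the model here; this stub returns a fixed answer."""
    return PlannedStep(reply="Here is the apple.", behavior="hand_over_object")

# Library of learned, closed-loop skills (neural-network policies on the real
# robot; plain print statements here).
POLICIES: dict[str, Callable[[], None]] = {
    "hand_over_object": lambda: print("[policy] running hand-over motion"),
    "place_in_rack":    lambda: print("[policy] running place-in-rack motion"),
}

def step(images: list, conversation: list) -> None:
    plan = call_vlm(images, conversation)   # high-level reasoning over sight and speech
    print(f"[speech] {plan.reply}")         # spoken answer
    POLICIES[plan.behavior]()               # low-level learned skill drives the hands

step(images=[], conversation=["Can I have something to eat?"])
```

The point of the sketch is the division of labor: the language model never computes joint angles itself, it only decides what to say and which pre-trained skill to trigger, which matches the setup described above.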
Figure 01's ability to see and hear its surroundings, combined with OpenAI's model, has given it many new capabilities: understanding its environment, building basic chains of reasoning and explaining why it made a decision, picking up on vague, high-level remarks left hanging in the air, such as "I'm hungry", turning that into a task and handing over an apple, and explaining everything it does in clear English. That ability to explain itself was what amazed me most in the demo. It interprets the images around it at 10 Hz, so roughly ten scene updates per second, and it outputs the resulting physical actions at 200 Hz, so two hundred motor commands per second. That is why the robot's movements flow so smoothly, without the sharp, staccato motions we have seen in previous robot demos.
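To make the 10 Hz / 200 Hz split concrete, here is a toy timing loop in Python. It is purely my own illustration: the two rates come from the text above, but the interpolation logic and variable names are invented and have nothing to do with Figure AI's actual controller.

```python
# Toy illustration of the two update rates mentioned above. The 10 Hz and 200 Hz
# figures come from the text; everything else is invented for clarity.

VISION_HZ = 10     # scene interpretations per second
CONTROL_HZ = 200   # motor commands per second
TICKS_PER_IMAGE = CONTROL_HZ // VISION_HZ   # 20 control ticks per vision update

def run_one_second() -> None:
    position = 0.0
    control_ticks = 0
    for image_index in range(VISION_HZ):
        # The vision side refreshes the target only ten times a second...
        target = float(image_index + 1)
        for _ in range(TICKS_PER_IMAGE):
            # ...while the controller nudges the joint toward that target two
            # hundred times a second, which is what keeps the motion smooth.
            position += 0.2 * (target - position)
            control_ticks += 1
    print(f"{VISION_HZ} vision updates, {control_ticks} control ticks, "
          f"final position {position:.2f}")

run_one_second()
```

The takeaway is simply that the slow "thinking" layer and the fast "moving" layer run at different rates, with the fast loop smoothing over the gaps between the slow updates.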
So what does all this mean? First, Tesla needs to pick up the pace, because the last Optimus demo looked far behind this one. Figure AI's collaboration with OpenAI, with Nvidia's enormous processing power behind it, has carried it a long way. That is the technological side of the matter, but the more important question is where humanity is heading, because these robots will clearly enter daily life much faster than we think. When we used to talk about robots, we meant cumbersome machines with giant arms that had to be programmed for every task. This one can see its surroundings, listen, combine that with a language model, think, and explain what it thinks and why, and all of it works together.
Of course, worrying questions immediately come to mind, such as whether these robots could be used by the military or as soldiers. For now, at least, the company's position is that it will never tolerate violence: "You will not see our robot on the battlefield," they say. It is very hard to know, but one thing I am sure of is that robots, and humanoids in particular, are joining our lives faster than we think.
The information, comments, and recommendations here do not constitute investment advice. Investment advisory services are provided under an investment advisory agreement signed between brokerage firms, portfolio management companies, non-deposit-taking banks, and their clients. The comments in this article are solely my personal views, and they may not suit your financial situation or risk-return preferences; for this reason, investment decisions should not be made based on the information and comments in my articles.