I know that's a huge statement, but as someone who has actually used the MetaGlasses, if my MetaGlasses had Gemini AI built into them and I could talk and reason about what I'm currently seeing on a day-to-day basis, I'm not sure I would ever stop using them. Many times during the day we have to switch over to another tab to use ChatGPT, but how much easier would it be if you just had glasses on your head and could simply ask them, "Hey, on my screen, yada, yada, yada," or, "Hey, I've got this in my hand, what do I do, X, Y, Z?"