At its I/O developer conference on Tuesday, Google showed off some impressive contenders for the title of AI killer app.
These were shared mostly under the umbrella of Project Astra, an experimental Google endeavor at the leading edge of AI models and agents.
"To be truly useful, an agent needs to understand and respond to the complex and dynamic world just like people do — and take in and remember what it sees and hears to understand context and take action," Demis Hassabis, CEO of Google DeepMind, said. "It also needs to be proactive, teachable and personal, so users can talk to it naturally and without lag or delay."
Never forget where you left your glasses again
In a video, Google showed an employee holding up a smartphone with the camera on. She walked through DeepMind's office in London pointing the device at various things and asking questions.
At one point the camera showed a speaker, and she asked what it was. A Google AI model running on the phone (and in the cloud) answered correctly.
Then she pointed the smartphone at a colleague's computer screen, which had a bunch of software code on it. The AI model correctly told her what that code was for, just by "looking" at the live video feed from the camera.
After a couple more examples, the DeepMind employee asked if the AI agent remembered where she left her glasses. The Google system replied that she'd left them next to an apple on her desk in the office. She walked over there and, lo and behold, there were her glasses by the apple on her desk. The AI agent "remembered" the glasses in the background of previous frames from the phone's live video feed.
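Google hasn't published how Astra is wired up internally, but the basic interaction in the demo — show the model a camera frame and ask a question about it — can be sketched with the company's public Gemini SDK. Here is a minimal, illustrative sketch, assuming a single frame saved as frame.jpg and an API key in the GOOGLE_API_KEY environment variable (both placeholders). The memory trick of recalling the glasses from earlier frames goes beyond a single call like this; it implies the agent retains context across frames over time.

```python
# Minimal sketch of "look at this frame and answer" multimodal prompting,
# using Google's public google-generativeai SDK -- not the Astra agent shown on stage.
import os
import PIL.Image
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-flash")

# A single frame grabbed from the phone's live video feed (hypothetical file).
frame = PIL.Image.open("frame.jpg")

# Ask a question about what the camera is seeing.
response = model.generate_content([frame, "What is the object in this image?"])
print(response.text)
```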
Returning shoes
CEO Sundar Pichai said the company's AI agents can plan and execute multiple tasks — to the point where the bots will be able to return a pair of shoes you ordered online and don't want.
Calendar entries
Google VP Sissie Hsiao showed off another killer application for this new technology.
In this demo, a smartphone camera was pointed at a school flier with details of several upcoming events. The Google AI agent captured all the dates, times, and other details and automatically loaded them into the user's Google Calendar.
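The article doesn't describe how the flier demo is built, but the extraction half — pulling dates, times, and other details out of a photographed flier as structured data — can be approximated with the same public Gemini SDK by asking for JSON output. A rough sketch follows, assuming a hypothetical flier.jpg; actually writing the events into Google Calendar would be a separate call to the Calendar API, which is not shown.

```python
# Sketch: extract event details from a photographed flier as JSON.
# Uses the public Gemini SDK; the Calendar insertion step is omitted.
import os
import PIL.Image
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-flash")

flier = PIL.Image.open("flier.jpg")  # hypothetical photo of the school flier
prompt = (
    "List every event on this flier as a JSON array of objects with "
    "'title', 'date', 'start_time', and 'location' fields."
)

response = model.generate_content(
    [flier, prompt],
    generation_config={"response_mime_type": "application/json"},
)
print(response.text)  # JSON the caller could then feed into the Calendar API
```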
Rental agreements
What if you want to know how a pet might change your apartment rental situation? Do you want to actually read the 12 legal documents you skimmed and signed last year? Of course you don't.
You can now drop all these documents into Google's Gemini Advanced AI model and start asking it questions like "If I get a pet, how does this change my rental situation?"
Google's AI agent will ingest all the documents quickly and answer your questions by referencing specific parts of the agreements.
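Gemini Advanced is a consumer product, but developers can approximate the same document Q&A workflow with the public Gemini File API: upload the PDFs, then ask questions grounded in them. A sketch under those assumptions, with a hypothetical lease.pdf standing in for the signed agreements:

```python
# Sketch: upload a rental agreement and ask a question grounded in it.
# Uses the public Gemini File API; file name and question are illustrative.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-pro")

# Upload the signed lease (hypothetical file); PDFs are supported by the File API.
lease = genai.upload_file(path="lease.pdf", display_name="Apartment lease")

question = (
    "If I get a pet, how does this change my rental situation? "
    "Cite the relevant clauses."
)
response = model.generate_content([lease, question])
print(response.text)
```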