  • I built an AI server for my daughters.

    我為我的女兒們搭建了一個人工智能服務器。

  • Well, first, it was more for me.

    首先,這更多是為了我自己。

  • I wanted to run all of my AI locally.

    我想在本地運行我所有的人工智能。

  • And I'm not just talking command line with Ollama.

    我說的不僅僅是用 Ollama 跑命令行。

  • No, no, no.

    不,不,不

  • We have a GUI, a beautiful chat interface.

    我們有一個圖形用戶界面,一個漂亮的聊天界面。

  • And this thing's feature-filled.

    這東西功能齊全。

  • It's got RBAC, chat histories, multiple models.

    它有 RBAC、聊天記錄和多種模型。

  • We can even add Stable Diffusion.

    我們甚至可以加上 Stable Diffusion。

  • And I was able to add this to my notes application, Obsidian, and have my chat interface right there.

    我可以將它添加到我的筆記應用 Obsidian 中,直接在那裡使用聊天界面。

  • I'm gonna show you how to do this.

    我來教你怎麼做。

  • Now, you don't need something crazy like Terry.

    現在,你不需要 Terry 這麼誇張的機器。

  • That's what I named my AI server.

    這就是我給人工智能服務器起的名字。

  • It can be something as simple as this, this laptop.

    就像這檯筆記本電腦一樣簡單。

  • I'll actually demo the entire setup on this laptop.

    實際上,我會在這檯筆記本電腦上演示整個設置。

  • So luckily, the computer you're using right now, the one you're watching this video on, will probably work.

    所以幸運的是,你現在使用的電腦,也就是你正在觀看這段視頻的電腦,很可能可以正常工作。

  • And seriously, you're gonna love this.

    說真的,你一定會喜歡的。

  • It's customizable.

    可定製。

  • It's wicked fast, like way faster than anything else I've used.

    它的速度非常快,比我用過的任何其他產品都要快。

  • Isn't that amazing?

    是不是很神奇?

  • And again, it's local.

    同樣,它也是在地的。

  • It's private.

    它是私人的。

  • I control it, which is important because I'm giving it to my daughters.

    我控制著它,這一點很重要,因為我要把它交給我的女兒們。

  • I want them to be able to use AI to help with school, but I don't want them to cheat or do anything else weird.

    我希望他們能利用人工智能幫助學習,但我不希望他們作弊或做其他奇怪的事情。

  • But because I have control,

    但因為我有控制權、

  • I can put in special model files that restrict what they can do, what they can ask.

    我可以加入特殊的模型文件,限制她們能做什麼、能問什麼。

  • And I'll show you how to do that.

    我會告訴你如何做到這一點。

  • So here we go.

    那麼,我們開始吧。

  • We're about to dive in.

    我們馬上就要開始動手了。

  • But first, let me have you meet Terry.

    但首先,讓我給你介紹一下特瑞。

  • Now, Terry has a lot of muscle.

    現在,特里肌肉發達。

  • So for the case, I needed something big.

    所以機箱方面,我需要一個大傢伙。

  • I got the Lian Li O11 Dynamic EVO XL.

    我買的是 Lian Li O11 Dynamic EVO XL。

  • It's a full tower E-ATX case.

    這是一款全塔式 E-ATX 機箱。

  • Perfect to hold my Asus X670-E Creator ProArt motherboard.

    非常適合放置我的華碩 X670-E Creator ProArt 主板。

  • This thing's also a beast.

    這東西也是一頭猛獸。

  • I'll put it in the description so you can look at it.

    我會把它寫在描述裡,這樣你就能看到了。

  • Now, I also gave Terry a big brain.

    現在,我還給了特瑞一個大腦袋。

  • He's got the AMD Ryzen 9 7950X.

    他使用的是 AMD Ryzen 9 7950X。

  • That's 4.2 gigahertz and 16 cores.

    這相當於 4.2 千兆赫和 16 個內核。

  • For memory, I went a little crazy.

    內存方面,我有點瘋狂。

  • I've got 128 gigabytes of the G.Skill Trident Z5 Neo.

    我有 128GB 的 G.Skill Trident Z5 Neo。

  • It's DDR5-6000 and way overkill for what I'm doing.

    它是 DDR5-6000,對我的工作來說太誇張了。

  • I think.

    我想是的

  • I got a Lian Li water cooler for the CPU.

    我為 CPU 配了一個 Lian Li 水冷散熱器。

  • I'm not sure if I'm saying Lian Li right.

    我不確定 Lian Li 的發音我念得對不對。

  • I don't know.

    我不知道。

  • Correct me in the comments.

    請在評論中指正我。

  • You always do.

    你總是這樣。

  • And then for the stuff AI loves, I got two 4090s.

    為了滿足人工智能的需求,我買了兩臺 4090。

  • It's the MSI Suprim and they're liquid cooled so they could fit on my motherboard.

    是微星 SUPRIM,液冷版,所以能裝進我的主板。

  • 24 gigabytes of memory each, giving me plenty of muscle for my AI models.

    每張卡有 24GB 顯存,為我的人工智能模型提供了充足的動力。

  • For storage, we got two Samsung 990 Pros, two terabytes, which you can't see because they're behind stuff.

    存儲方面,我們有兩塊 2TB 的三星 990 Pro,它們被其他零件擋住了,所以看不到。

  • And also a Corsair AX1600i power supply.

    還有一個海盜船 AX1600i 電源。

  • 1600 watts to power the entire build.

    1600 瓦的功率可為整個構建提供動力。

  • Terry is ready.

    特里準備好了

  • Now I'm surprised to say my system actually posted on the first attempt, which is amazing.

    現在,我很驚訝地說,我的系統居然第一次開機就通過了自檢(POST),這真是太神奇了。

  • But what's not amazing is the fact that Ubuntu would not install.

    但不那麼美妙的是,Ubuntu 怎麼都裝不上。

  • I tried for hours, actually for a whole day.

    我試了好幾個小時,實際上試了一整天。

  • And I almost gave up and installed Windows, but I said, no, Chuck, you're installing Linux.

    我差點放棄安裝 Windows,但我說,不,查克,你要安裝 Linux。

  • So I tried something new, something I've never messed with before.

    於是,我嘗試了一些新的東西,一些我以前從未嘗試過的東西。

  • It's called Pop!_OS by System76.

    它叫 Pop!_OS,由 System76 出品。

  • This thing is awesome.

    這東西太棒了。

  • It worked the first time.

    第一次就成功了。

  • It even had a special image with NVIDIA drivers built in.

    它甚至還有一個內置英偉達驅動程序的特殊鏡像。

  • It just stinking worked.

    它就這麼順利地跑起來了。

  • So I sipped some coffee, didn't question the magic and moved on.

    於是,我喝了幾口咖啡,沒有質疑魔法,繼續前行。

  • Now, if you do want to build something similar,

    現在,如果你確實想建造類似的東西、

  • I've got all the links below.

    下面是所有鏈接。

  • But anyways, let's talk about how to build your very own local AI server.

    無論如何,讓我們來談談如何建立自己的在地人工智能服務器。

  • First, what do you need?

    首先,您需要什麼?

  • Really, all you'll need is a computer.

    真的,你只需要一臺電腦。

  • That's it.

    就是這樣。

  • It can be any computer running Windows, Mac or Linux.

    它可以是任何運行 Windows、Mac 或 Linux 的電腦。

  • And if you have a GPU, you'll have a much better time.

    如果你有 GPU,你會玩得更開心。

  • Now, again, I have to emphasize this.

    現在,我必須再次強調這一點。

  • You won't need something as beefy as Terry, but the more powerful your computer is, the better time you'll have.

    你不需要像 Terry 那樣強大的電腦,但電腦越強,體驗就越好。

  • Don't come at me with a Chromebook, please.

    拜託,別拿一臺 Chromebook 來找我。

  • Now, step one: Ollama.

    現在,第一步:Ollama。

  • This is the foundation for all of our AI stuff and what we'll use to run AI models.

    這是我們所有 AI 功能的基礎,也是我們用來運行 AI 模型的工具。

  • So we'll head on over to ollama.com and click on download.

    所以,我們前往 ollama.com,點擊下載。

  • And they've got a flavor for every OS.

    而且每個作業系統都有對應的版本。

  • I love that.

    我喜歡這樣。

  • Now, if you're on Mac, just download it right now and run it.

    如果你用的是 Mac,現在就下載並運行它。

  • If you're on Windows, they do have a preview version, but I don't want you to do that.

    如果你使用的是 Windows 系統,他們確實有一個預覽版,但我不希望你這樣做。

  • Instead, I want you to try the Linux version.

    相反,我想讓你試試 Linux 版本。

  • We can install it with one command.

    我們只需一條命令就能安裝它。

  • And yes, you can run Linux on Windows with WSL.

    是的,你可以使用 WSL 在 Windows 上運行 Linux。

  • Let's get that going real quick.

    讓我們快點開始吧。

  • First thing I'll do is go to the start bar and search for terminal.

    我要做的第一件事就是在開始欄中搜索終端。

  • I launched my terminal.

    我啟動了我的終端。

  • Now, this first bit is for Windows folks only,

    現在,第一點僅供 Windows 用戶使用、

  • Linux people, just hang on for a moment.

    Linux 用戶,請稍等片刻。

  • We got to get WSL installed or the Windows subsystem for Linux.

    我們必須安裝 WSL 或 Linux 的 Windows 子系統。

  • It's only one command: wsl --install.

    只有一條命令:wsl --install。
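
For reference, here is that command spelled out; run it in Windows Terminal, and it installs WSL with the default Ubuntu distribution:

```
# Installs WSL and Ubuntu; reboot if prompted, then set a Linux username and password
wsl --install
```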

  • And that's it actually.

    實際上就是這樣。

  • Hit enter, and it's gonna start doing some stuff.

    按下回車鍵,它就會開始做一些事情。

  • When it's done, we'll set up a username and password.

    完成後,我們將設置用戶名和密碼。

  • I got a new keyboard, by the way.

    順便說一句,我買了一個新鍵盤。

  • Do you hear that?

    你聽到了嗎?

  • Link below, it's my favorite keyboard in the entire world.

    鏈接如下,這是我在全世界最喜歡的鍵盤。

  • Now, some of you may have to reboot, that's fine.

    現在,有些人可能需要重新啟動,沒關係。

  • Just pause the video and come back.

    暫停視頻,然後再回來。

  • Mine is ready to go though, and we're rocking Ubuntu 22.04, which is still amazing to me that we're running Linux on Windows.

    不過,我的設備已經準備就緒,我們正在使用 Ubuntu 22.04,在 Windows 上運行 Linux 仍然讓我感到不可思議。

  • That's just magic, right?

    這就是魔法,對嗎?

  • Now, we're about to install Olama, but before we do that, you got to do some best practice stuff, like updating our packages.

    現在,我們要安裝 Olama,但在此之前,你得做一些最佳實踐,比如更新我們的套裝軟體。

  • So we'll do a sudo apt update, and then we'll do a sudo apt upgrade -y to apply all those updates.

    所以,我們先執行 sudo apt update,再執行 sudo apt upgrade -y 來套用所有更新。
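
Spelled out for copy-paste:

```
# Refresh package lists, then apply all pending upgrades
sudo apt update
sudo apt upgrade -y
```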

  • And actually, while it's updating, can I tell you something about our sponsor?

    實際上,在更新的同時,我能告訴你一些關於我們贊助商的事情嗎?

  • IT Pro by ACI Learning.

    ACI Learning 的 IT Pro。

  • Now, in this video, we're gonna be doing lots of heavy Linux things.

    現在,在這段視頻中,我們要做很多 Linux 的重頭戲。

  • I'm gonna walk you through it.

    我會一步一步帶著你做。

  • I'm gonna hold your hand, and you may not really understand what's happening.

    我會握著你的手,你可能真的不明白髮生了什麼。

  • That's where IT Pro comes in.

    這就是 IT Pro 的作用所在。

  • If you want to learn Linux or really anything in IT, they are your go-to.

    如果你想學習 Linux 或 IT 方面的任何知識,他們都是你的首選。

  • That's what I use to learn new stuff.

    我就是用它來學習新知識的。

  • So if you want to learn Linux to get better at this stuff, or you want to start making this whole hobby thing your career, actually learn some skills, get some certifications, get your A+, get your CCNA, get your AWS certifications, your Azure certifications, and go down this crazy IT path, which is incredible, and it's the whole reason I make this channel and make these videos.

    是以,如果你想通過學習 Linux 來更好地掌握這些東西,或者你想把這個愛好變成你的職業,那就學習一些技能,獲得一些認證,獲得 A+、CCNA、AWS 認證、Azure 認證,沿著這條瘋狂的 IT 之路走下去,這太不可思議了,這也是我製作這個頻道和這些視頻的全部原因。

  • Check out IT Pro.

    查看 IT Pro。

  • They've got IT training that won't put you to sleep.

    他們提供的 IT 培訓不會讓你昏昏欲睡。

  • They have labs, they have practice exams, and if you use my code NetworkChuck right now, you'll get 30% off forever.

    他們有實驗室,有模擬考試,如果你現在使用我的代碼 NetworkChuck,你將永遠享受 30% 的折扣。

  • So go learn some Linux, and thank you to IT Pro for sponsoring this video and making things like this possible.

    所以,快去學習 Linux 吧!感謝 IT Pro 贊助本視頻,讓這樣的事情成為可能。

  • And speaking of, my updates are done.

    說到這裡,我的更新已經完成了。

  • And by the way, I will have a guide for this entire thing.

    順便說一句,我會為整個流程準備一份指南。

  • Every step, all the commands, you can find it at the free NetworkChuck Academy membership.

    每個步驟、所有命令,都可以在免費的 NetworkChuck Academy 會員區找到。

  • Click the link below to join and get some other cool stuff as well.

    點擊下面的鏈接加入,還能獲得其他很酷的東西。

  • I can't wait to see you there.

    我迫不及待地想在那裡見到你。

  • Now we can install Ollama with one command.

    現在,我們只需一條命令就能安裝 Ollama。

  • And again, all commands are below.

    同樣,所有命令都在下面。

  • Just gonna paste this in.

    我只是想把這個粘貼進去。

  • A nice little curl command, a little magic stuff, and this, I love how easy this is, watch.

    一條漂亮的小 curl 命令,一點小魔法,而且,我太喜歡它的簡單了,看著。
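
The one-liner is Ollama's official Linux install script; grab the current version of it from ollama.com before running, in case it has changed:

```
# Download and run Ollama's Linux install script
curl -fsSL https://ollama.com/install.sh | sh
```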

  • You just sit there and let it happen.

    你只需坐在那裡,順其自然。

  • Do you not feel like a wizard when you're installing stuff like this?

    安裝這樣的東西時,你不覺得自己像個魔法師嗎?

  • And the fact that you're installing AI right now, come on.

    而且你現在就在安裝人工智能,得了吧。

  • Now notice one thing real quick.

    現在請迅速注意一件事。

  • Ollama did automatically find out that I have an NVIDIA GPU, and it's like, awesome, you're gonna have a great time.

    Ollama 自動發現我有一塊英偉達™(NVIDIA®)GPU,然後就說,太棒了,你會玩得很開心的。

  • If it didn't see that and you do have a GPU, you may have to install some NVIDIA CUDA drivers.

    如果它沒有看到,而你確實有 GPU,你可能需要安裝一些英偉達™(NVIDIA®)CUDA 驅動程序。

  • I'll put a link for that below, but not everyone will have to do that.

    我將在下面提供一個鏈接,但不是每個人都必須這樣做。

  • And if you're rocking a Mac with an M1 through M3 chip, you're gonna have a good time too.

    如果你使用的是裝有 M1 至 M3 芯片的 Mac,你也會玩得很開心。

  • They will use the embedded GPU.

    它們將使用嵌入式 GPU。

  • Now at this point, our Mac users, our Linux users, and our Windows users are all converged.

    在這一點上,我們的 Mac 用戶、Linux 用戶和 Windows 用戶都趨於一致。

  • We're on the same path.

    我們走的是同一條路。

  • Welcome, we can hold hands and sing.

    歡迎,我們可以手拉手一起唱歌。

  • It's getting weird.

    越來越奇怪了

  • Anyways, first we have to test a few things to make sure Ollama is working.

    總之,我們首先要測試一些東西,確保 Ollama 能正常工作。

  • And for that, we're gonna open our web browser.

    為此,我們要打開網絡瀏覽器。

  • I know, it's kind of weird.

    我知道,這有點奇怪。

  • Just stick with me.

    堅持住

  • I'm gonna launch Chrome here, and here in my address bar, I'm gonna type in localhost, which is looking right here at my computer, and port 11434.

    我要啟動 Chrome 瀏覽器,在地址欄中輸入 localhost,也就是我的電腦,然後輸入 11434 端口。

  • Hit enter, and if you see this right here, this message, you're good to go, and you're about to find this out.

    點擊回車鍵,如果你在這裡看到這條資訊,你就可以開始了,你馬上就會發現這一點。

  • Port 11434 is what Ollama's API service is running on, and it's how our other stuff is gonna interact with it.

    11434 是 Ollama API 服務運行的端口,我們的其他程序都會通過它來交互。
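
If you'd rather check from the terminal, the same test works with curl; a healthy install answers with a short "Ollama is running" message:

```
# Ollama's API listens on localhost port 11434 by default
curl http://localhost:11434
```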

  • It's so powerful, just check this out.

    它太強大了,看看這個。

  • I'm so excited to show you this.

    我很高興能給你看這個。

  • Now, before we move on, let's go ahead and add an AI model to Ollama.

    現在,在繼續之前,我們先給 Ollama 添加一個人工智能模型。

  • And we can do that right now with ollama pull, and we'll pull down Llama 2, a very popular one.

    我們現在就可以用 ollama pull 來做到,我們要下載 Llama 2,一個非常流行的模型。

  • Hit enter, and it's ready.

    點擊回車,就可以了。
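
That pull command, for reference:

```
# Download the Llama 2 model (a few gigabytes)
ollama pull llama2
```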

  • Now, let's test it out real quick.

    現在,讓我們來快速測試一下。

  • We'll do ollama run llama2.

    我們執行 ollama run llama2。
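
And the run command; typing /bye exits the chat when you're done:

```
# Start an interactive chat session with the model
ollama run llama2
```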

  • And if this is your first time doing this, this is kind of magic.

    如果這是你第一次這樣做,這就有點神奇了。

  • We're about to interact with a ChatGPT-like AI right here.

    我們馬上就要在這裡和一個類似 ChatGPT 的 AI 互動了。

  • No internet required, it's all just happening in that five gigabyte file.

    無需聯網,一切都在這個五千兆字節的文件中發生。

  • Tell me about the solar eclipse.

    跟我說說日食吧

  • Boom, and you can actually Ctrl+C that to stop it.

    嘭,而且你其實可以按 Ctrl+C 來停止它。

  • Now, I wanna show you this.

    現在,我想給你看這個

  • I'm gonna open up a new window.

    我要打開一個新窗口。

  • This is actually an awesome command.

    這其實是一個很棒的命令。

  • And with this WSL command,

    使用 WSL 命令

  • I'm just connecting to the same instance again, a new window.

    我只是再次連接到同一個實例,一個新窗口。

  • I'm gonna type in watch -n 0.5 — that's zero point five — nvidia-smi.

    我要輸入 watch -n 0.5(是 0.5),然後 nvidia-smi。

  • This is going to watch the performance of my GPU right here in the terminal and keep refreshing.

    它將在終端上觀察我的 GPU 性能,並不斷刷新。
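
The exact command, if you want the same GPU monitor in a second terminal:

```
# Re-run nvidia-smi every half second to watch GPU utilization and VRAM
watch -n 0.5 nvidia-smi
```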

  • So keep an eye on this right here as I chat with Llama 2.

    所以在我和 Llama 2 聊天的時候,盯著這邊看。

  • Llama 2, give me a list of all Adam Sandler movies.

    Llama 2,給我一份亞當-桑德勒所有電影的清單。

  • And look at that GPU go, ah, it's so fun.

    再看看 GPU,啊,太有趣了。

  • Now, can I show you what Terry does real quick?

    現在,我可以讓你看看特里是怎麼做的嗎?

  • I gotta show you Terry.

    我得讓你看看特瑞

  • Terry has two GPUs.

    特里有兩個 GPU。

  • Here they are right here.

    它們就在這裡。

  • And Ollama can actually use both of them at the same time.

    而且 Ollama 實際上可以同時使用這兩張顯卡。

  • Check this out, it's so cool.

    看看這個,太酷了。

  • List all the Samuel L. Jackson movies.

    列出塞繆爾-傑克遜的所有電影。

  • And look at that.

    看看這個。

  • Isn't that amazing?

    是不是很神奇?

  • And look how fast it went.

    看看它走得多快。

  • That's ridiculous.

    這太荒謬了。

  • This is just the beginning.

    這僅僅是個開始。

  • So anyways, I had to show you Terry.

    總之,我得讓你看看特里。

  • So now we have Ollama installed.

    現在我們已經安裝好了 Ollama。

  • That's just our base.

    這只是我們的基地。

  • Remember, I'm gonna say bye.

    記住,我要說再見了

  • So forward slash bye — /bye — to end that session.

    輸入 /bye 結束這次會話。

  • Step two is all about the web UI.

    第二步是網絡用戶界面。

  • And this thing is amazing.

    這東西太神奇了

  • It's called OpenWebUI.

    它叫 OpenWebUI。

  • And it's actually one of many web UIs you can get for Ollama.

    實際上,它是 Ollama 眾多可用的網頁界面之一。

  • But I think OpenWebUI is the best.

    但我認為 OpenWebUI 是最好的。

  • Now OpenWebUI will be run inside a Docker container.

    現在,OpenWebUI 將在 Docker 容器中運行。

  • So you will need Docker installed.

    是以,你需要安裝 Docker。

  • And we'll do that right now.

    我們現在就去做。

  • So we'll just copy and paste the commands from NetworkChuck Academy.

    所以,我們只需從 NetworkChuck Academy 複製並粘貼命令即可。

  • This is also available on Docker's website.

    這也可以在 Docker 網站上找到。

  • First step is updating our repositories and getting Docker's GPG key.

    第一步是更新我們的軟件源並獲取 Docker 的 GPG 密鑰。

  • And then with one command, we'll install Docker and all its goodies.

    然後只需一條命令,我們就能安裝 Docker 及其所有好東西。
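
For reference, this is roughly what Docker's official Ubuntu install steps look like; copy the current version from docs.docker.com if anything here has drifted:

```
# Add Docker's official GPG key and apt repository
sudo apt-get update
sudo apt-get install -y ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo $VERSION_CODENAME) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# Install Docker Engine and its plugins
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
```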

  • Ready, set, go.

    準備,準備,開始

  • Yes, let's do it.

    是的,我們開始吧。

  • And now with Docker installed, we'll use it to deploy our OpenWebUI container.

    現在,Docker 已安裝完畢,我們將用它來部署 OpenWebUI 容器。

  • There'll be one command.

    只有一個命令

  • You can simply copy and paste this.

    您只需複製並粘貼即可。

  • This docker run command is going to pull this image to run this container from OpenWebUI.

    這個 docker run 命令將從 OpenWebUI 中調用這個鏡像來運行這個容器。

  • It's looking at your local computer for the Ollama base URL, because it's going to integrate and use Ollama.

    它會在你的本機上查找 Ollama 的 base URL,因為它要集成並使用 Ollama。

  • And it's going to be using the host network adapter to make things nice and easy.

    它將使用主機網絡適配器,使事情變得簡單易行。

  • Keeping in mind, this will use port 8080 on whatever system you're using.

    請注意,無論您使用什麼系統,都將使用 8080 端口。

  • Now all we have to do is hit enter after we add some sudo at the beginning, sudo docker run, and let it do its thing.

    現在,我們要做的就是在開頭添加一些 sudo 後按回車鍵,輸入 sudo docker run,然後讓它開始工作。
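
The run command in the guide follows Open WebUI's published Docker instructions; a sketch of it looks like this (compare the image tag and flags against the Open WebUI README, since they change):

```
# Run Open WebUI on the host network so it can reach Ollama on localhost:11434;
# with --network=host the web interface is served on port 8080
sudo docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```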

  • Let's verify it real quick.

    讓我們快速驗證一下。

  • We'll do a little sudo docker ps.

    我們來做一下 sudo docker ps。

  • We can see that it is indeed running.

    我們可以看到它確實在運行。

  • And now let's go log in.

    現在登錄吧

  • It's kind of exciting.

    這有點令人興奮。

  • Okay, let's go to our web browser and we'll simply type in localhost colon port 8080.

    好的,讓我們打開瀏覽器,輸入 localhost 冒號 8080 端口。

  • Whoa, okay, it's really zoomed in.

    哇,好吧,真的放大了。

  • I'm not sure why.

    我不知道為什麼。

  • You shouldn't do that.

    你不該這麼做

  • Now for the first time you run it, you'll want to click on sign up right here at the bottom and just put your stuff in.

    第一次運行時,您需要點擊底部的註冊,然後輸入您的資訊。

  • This login info is only pertinent to this instance, this local instance.

    這些登錄資訊只與本實例(在地實例)相關。

  • We'll create the account and we're logged in.

    我們將創建賬戶並登錄。

  • Now, just so you know, the first account you log in with or sign up with will automatically become an admin account.

    請注意,您登錄或註冊的第一個賬戶將自動成為管理員賬戶。

  • So right now you as a first time user logging in, you get the power, but look at this.

    是以,現在你作為首次登錄的用戶,就可以使用這些功能,但看看這個。

  • How amazing is this?

    這有多神奇?

  • Let's play with it.

    我們來玩玩吧。

  • So the first thing we have to do is select the model.

    是以,我們首先要做的就是選擇模型。

  • I'll click that dropdown and we should have one, llama2.

    我點擊下拉菜單,應該就有了,llama2。

  • Awesome.

    棒極了

  • And that's how we know also our connection is working.

    這樣我們就能知道我們的連接是否有效。

  • I'll go and select that.

    我去選一下。

  • And by the way, another way to check your connection is by going to your little icon down here at the bottom left and clicking on settings and then connections.

    順便說一下,檢查連接的另一種方法是進入左下角的小圖標,點擊設置,然後點擊連接。

  • And you can see our Ollama base URL is right here, if you ever have to change that for whatever reason.

    你可以看到我們的 Ollama base URL 就在這裡,如果因為任何原因需要更改,就在這裡改。

  • Now with llama2 selected, we can just start chatting.

    現在選好了 llama2,我們就可以開始聊天了。

  • And just like that, we have our own little ChatGPT.

    就這樣,我們有了自己的小 ChatGPT。

  • That's completely local.

    這是完全本地化的。

  • And this sucker is beautiful and extremely powerful.

    這傢伙非常漂亮,而且功能極其強大。

  • Now, first things, we can download more models.

    首先,我們可以下載更多模型。

  • We can go out to Ollama and see what they have available.

    我們可以去 Ollama 網站看看有哪些模型可用。

  • Click on their models to see their list of models.

    點擊 Models 就能看到模型列表。

  • CodeGemma is a big one.

    CodeGemma 是個很受歡迎的大模型。

  • Let's try that.

    讓我們試試看。

  • So to add CodeGemma, our second model, we'll go back to our command line here and type in ollama pull codegemma.

    要添加第二個模型 CodeGemma,我們回到命令行,輸入 ollama pull codegemma。

  • Cool, it's done.

    酷,搞定了。

  • Once that's pulled, we can go up here and just change our model by clicking on the little dropdown icon at the top.

    拉出後,我們可以點擊頂部的下拉小圖標,更改模型。

  • Yep, there's CodeGemma.

    沒錯,CodeGemma 在這裡。

  • We can switch.

    我們可以交換。

  • And actually, I've never done this before, so I have no idea what's gonna happen.

    事實上,我以前從沒做過這個,所以我不知道會發生什麼。

  • I'm gonna click on my original model, llama2.

    我要點擊我原來的模型,llama2。

  • You can actually add another model to this conversation.

    實際上,你還可以在這個對話中加入另一個模型。

  • Now we have two here.

    現在我們這裡有兩個。

  • What's gonna happen?

    會發生什麼?

  • So CodeGemma is answering it first.

    所以 CodeGemma 先回答。

  • I'm actually not sure what that does.

    其實我也不知道這有什麼用。

  • Maybe you guys can try it out and tell me.

    也許你們可以試一試,然後告訴我。

  • I'm gonna move on though.

    不過我要繼續前進了。

  • Now some of the crazy stuff.

    現在是一些瘋狂的事情。

  • You can see right here.

    您可以在這裡看到。

  • It's almost more featured than ChatGPT in some ways.

    在某些方面,它幾乎比 ChatGPT 更具特色。

  • You got a bunch of options for editing your responses, copying, liking and disliking it to help it learn.

    您有很多編輯回覆、複製、喜歡和不喜歡的選項,以幫助它學習。

  • You can also have it read things out to you, continue response, regenerate response, or even just add stuff with your own voice.

    您還可以讓它為您朗讀、繼續迴應、再生迴應,甚至用您自己的聲音添加內容。

  • I can also go down here, and this is crazy.

    我還可以從這裡下去,這太瘋狂了。

  • I can mention another model, and it's gonna respond to this and think about it.

    我可以提到另一個模型,它也會對此做出反應並加以思考。

  • Did you see that?

    你看到了嗎?

  • I just had my other model talk to my current, like, that's just weird, right?

    我剛剛讓另一個模型跟我當前的模型對話,這也太怪了吧?

  • Let's try to make them have a conversation.

    讓我們試著讓他們進行對話。

  • Like, they're gonna have a conversation.

    就像,他們會有一個對話。

  • What are they gonna talk about?

    他們要談什麼?

  • Let's bring back in llama2 to ask the question.

    讓我們把 llama2 請回來提問。

  • This is hilarious.

    太搞笑了

  • I love this so much.

    我太喜歡這個了。

  • Okay, anyways, I could spend all day doing this.

    好吧,不管怎麼說,我可以花一整天的時間來做這件事。

  • We can also, with this plus sign, upload files.

    我們還可以用這個加號上傳文件。

  • This includes a lot of things.

    這包括很多方面。

  • Let's try, do I have any documents here?

    讓我們試試,我這裡有什麼文件嗎?

  • I'll just copy and paste the contents of an article.

    我只是複製和粘貼一篇文章的內容。

  • Save that, and that'll be our file.

    保存,這就是我們的文件。

  • Summarize this.

    總結一下。

  • You can see our GPU being used over here.

    您可以在這裡看到我們 GPU 的使用情況。

  • I love that so much.

    我非常喜歡。

  • You're running locally.

    這都是在本地運行的。

  • Cool.

    酷斃了

  • We can also add pictures for multimodal models.

    我們還可以為多模態模型添加圖片。

  • I'm not sure CodeGemma can do that.

    我不確定 CodeGemma 能不能做到。

  • Let's try it out real quick.

    讓我們快速嘗試一下。

  • So CodeGemma can't do it, but there is a multimodal model called LLaVA.

    所以 CodeGemma 做不到,但有一個叫 LLaVA 的多模態模型。

  • Let's pull that down real quick.

    讓我們把它快速拉下來。
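
The pull for that one:

```
# LLaVA is a multimodal model that can look at images
ollama pull llava
```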

  • With LLaVA pulled, let's go to our browser here once more.

    下載好 LLaVA 之後,我們再回到瀏覽器。

  • We'll refresh it, change our model to LLaVA, add the image.

    我們刷新頁面,把模型換成 LLaVA,然後添加圖片。

  • That's really scary.

    這真的很可怕。

  • That's pretty cool.

    太酷了

  • Now, here in a moment,

    現在,馬上就來、

  • I will show you how we can generate images right here in this web interface by using Stable Diffusion.

    我會向大家展示如何用 Stable Diffusion 直接在這個網頁界面裡生成圖像。

  • But first, let's play around a bit more.

    但首先,讓我們再玩一玩。

  • And actually, the first place I wanna go to is the admin panel for you, the admin.

    實際上,我想去的第一個地方是你的管理面板,也就是管理員。

  • We have one user, and if we click on the top right, we have admin settings.

    我們有一個用戶,如果點擊右上角,就可以看到管理員設置。

  • Here's where a ton of power comes in.

    這就是強大動力的來源。

  • First, we can restrict people from signing up.

    首先,我們可以限制人們註冊。

  • We can say enabled or disabled.

    我們可以說啟用或禁用。

  • Now, right now, by default, it's enabled.

    現在,默認情況下是啟用的。

  • That's perfect.

    太完美了

  • And when they try to sign up initially, they'll be a pending user until they're approved.

    當他們一開始註冊時,會先是待審核用戶,直到你批准為止。

  • Let me show you.

    讓我演示給你看

  • So now, real quick.

    所以,現在,真正的快。

  • If you wanna have someone else use this server on your laptop or computer or whatever it is, they can access it from anywhere as long as they have your IP address.

    如果你想讓別人在你的筆記本電腦或電腦或其他設備上使用這個服務器,只要他們有你的 IP 地址,他們就可以從任何地方訪問它。

  • So let me do a new user signup real quick just to show you.

    讓我快速註冊一個新用戶,向大家展示一下。

  • I'll open an incognito window.

    我會打開一個隱身窗口。

  • Create account.

    創建賬戶。

  • And look, it's saying, hey, you gotta wait.

    你看,它在說,嘿,你得等等。

  • Your guy has to approve you.

    得由管理員批准你才行。

  • And if we go here and refresh our page, on the dashboard, there is Bernard Hackwell.

    如果我們在這裡刷新頁面,在儀表盤上就會看到伯納德-哈克韋爾。

  • We can say, you know what?

    我們可以說,你知道嗎?

  • He's a user.

    他是一個用戶。

  • Or click it again.

    或者再次點擊。

  • He's an admin.

    他是管理員

  • No, no, he's not.

    不,不,他沒有。

  • He's gonna be a user.

    他會成為一個用戶。

  • And if we check again, boom, we have access.

    如果我們再次檢查,"砰 "的一聲,我們就能進入了。

  • Now, what's really cool is if I go to admin settings and I go to users, I can say, hey, you know what?

    現在,最酷的是,如果我進入管理員設置,然後進入用戶,我就可以說,嘿,你知道嗎?

  • Don't allow chat deletion, which is good if I'm trying to monitor what my daughters are kind of up to on their chats.

    不允許刪除聊天記錄,如果我想監控女兒們在聊天記錄上的內容,這一點很好。

  • I can also whitelist models.

    我還可以將模型列入白名單。

  • Like, so you know what?

    你知道嗎?

  • They're only allowed to use Llama 2.

    他們只能使用 Llama 2。

  • And that's it.

    就是這樣。

  • So when I get back to Bernard Hackwell's session over here,

    所以,當我回到 Bernard Hackwell 的這個會話時,

  • I should only have access to Llama 2.

    我應該只能使用 Llama 2。

  • It's pretty sick.

    這也太酷了。

  • And it becomes even better when you can make your own models that are restricted.

    如果您能自己製作受限制的模型,效果會更好。

  • We're gonna mosey on over to the section called model files right up here.

    我們將在這裡的 "模型文件 "部分進行操作。

  • And we'll click on create a model file.

    然後點擊創建模型文件。

  • Now, you can also go to the community and see what people have created.

    現在,你還可以進入社區,看看大家都創造了什麼。

  • That's pretty cool.

    太酷了

  • I'm gonna show you what I've done for my daughter, Chloe, to prevent her from cheating.

    我要給你們看看我為我女兒克洛伊做的設置,防止她作弊。

  • She named her assistant Debra.

    她給自己的助手取名為黛布拉。

  • And here's the content.

    內容如下

  • I'm gonna paste it in right now.

    我現在就貼上去。

  • The main thing is up here where it says FROM, and you choose your model, so FROM llama2.

    最重要的是上面這行 FROM,你在這裡選擇模型,所以是 FROM llama2。

  • And then you have your system prompt, which is gonna be between three double quotes.

    然後是系統提示符,位於三個雙引號之間。

  • And I've got all this, telling it what it can and can't do, what Chloe's allowed to ask.

    我在這裡寫了一大段,告訴它能做什麼、不能做什麼,克洛伊可以問什麼。

  • And it ends down here with three double quotes.

    最後用三個雙引號結束。
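
For a rough idea of the syntax (the wording of the actual system prompt isn't shown in the video, so the prompt text below is only an illustration): the file starts with FROM and the base model, then the SYSTEM block sits between triple double quotes.

```
FROM llama2
SYSTEM """
You are Debra, a homework helper for a student.
Guide the student through their schoolwork step by step.
Do not write essays, papers, or homework answers for them,
and politely refuse any request to do so.
"""
```

Open WebUI builds this file for you from its model files page; the same syntax also works from the command line with ollama create debra -f ./Modelfile if you ever want to do it outside the web UI.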

  • You can do a few more things.

    你還可以做一些事情。

  • I'm just gonna say as an assistant, education, save and create.

    我就填上:助手、教育類,然後點保存並創建。

  • Then I'll go over to my settings once more and make sure that for the users, this model is whitelisted.

    然後,我將再次進入我的設置,確保用戶將此模型列入白名單。

  • I'll add one more, Debra.

    我再加一個:黛布拉。

  • Notice she has an option now.

    請注意,她現在有了一個選擇。

  • And if Bernard's gonna try and use Debra and say,

    如果伯納德想利用黛博拉 說:

  • Debra, paper for me on the Civil War.

    黛布拉,幫我寫一份關於南北戰爭的論文。

  • And immediately it will shut that down, saying, hey, that's cheating.

    它馬上就會拒絕,說:嘿,這是作弊。

  • Now, Llama 2, the model we're using, it's okay.

    現在,我們用的這個模型 Llama 2,表現還可以。

  • There's a better one called Mixtral.

    有一個更好的,叫 Mixtral。

  • Let me show you Terry.

    讓我帶你看看特里。

  • I'll use Debra or Deb and say, write me a paper.

    我會用 Debra 或 Deb,然後說,給我寫一篇論文。

  • On Benjamin Franklin.

    主題是本傑明-富蘭克林。

  • And notice how it didn't write it for me, but it says it's gonna guide me.

    請注意,它不是為我寫的,而是說它會引導我。

  • And that's what I told it to do, to be a guide.

    這就是我讓它做的,做一個嚮導。

  • I tried to push it and it said no.

    我試著逼它一下,它還是拒絕了。

  • So that's pretty cool.

    所以這很酷。

  • You can customize these prompts, put in some guardrails for people that don't need full access to the kind of stuff right now.

    你可以定製這些提示,為現在不需要完全訪問這些內容的人設置一些防護措施。

  • I think it's awesome.

    我覺得這很棒。

  • Now, OpenWebUI does have a few more bells and whistles, but I wanna move on to getting Stable Diffusion set up because this thing is so cool and powerful.

    OpenWebUI 還有不少別的功能,但我想先去把 Stable Diffusion 裝起來,因為它實在太酷、太強大了。

  • Step three: Stable Diffusion.

    第三步:Stable Diffusion。

  • I didn't think that image generation locally would be as fun or as powerful as ChatGPT, but it's more.

    我本來以為在本地生成圖像不會像 ChatGPT 那樣有趣或強大,但它有過之而無不及。

  • Like, it's crazy.

    太瘋狂了

  • You gotta see it.

    你一定要看看。

  • Now, we'll be installing Stable Diffusion with a UI called AUTOMATIC1111.

    現在,我們要安裝 Stable Diffusion,用的是一個叫 AUTOMATIC1111 的界面。

  • So let's knock it out.

    那我們就開始吧

  • Now, before we install it, we got some prereqs.

    現在,在安裝之前,我們有一些前提條件。

  • And one of them is an amazing tool I've been using a lot called PyENV, which helps us manage our Python versions and switch between them, which is normally such a pain.

    其中一個工具是我經常使用的 PyENV,它可以幫助我們管理 Python 版本並在它們之間切換,而這通常是一件非常麻煩的事情。

  • Anyways, the first thing we gotta do is make sure we have a bunch of prereqs that's installed.

    總之,我們要做的第一件事就是確保我們已經安裝了大量的先決條件。

  • Go ahead and copy and paste this from the NetworkChuck Academy.

    直接從 NetworkChuck Academy 複製並粘貼這段命令。
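
The prerequisite list is roughly pyenv's suggested build environment for Ubuntu; check the pyenv wiki for the current set:

```
# Libraries pyenv needs to compile Python from source, plus curl and git
sudo apt install -y build-essential libssl-dev zlib1g-dev libbz2-dev \
  libreadline-dev libsqlite3-dev curl git libncursesw5-dev xz-utils \
  tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev
```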

  • Let it do its thing for a bit.

    讓它做一會兒。

  • And with the prereqs installed, we'll copy and paste this command, a curl command that'll automatically do everything for us.

    安裝了先決條件後,我們將複製並粘貼這條命令,它是一條 curl 命令,會自動為我們完成所有工作。
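
That curl one-liner is the pyenv installer:

```
# Install pyenv (and its default plugins) under ~/.pyenv
curl https://pyenv.run | bash
```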

  • I love it.

    我喜歡

  • Run that.

    運行它。

  • And then right here, it tells us we need to add all this or just run this command to put this in our bashrc file so we can actually use the PyENV command.

    然後在這裡,它會告訴我們需要添加這些內容,或者運行這條命令,把這些內容放到我們的 bashrc 文件中,這樣我們就可以使用 PyENV 命令了。

  • I'll just copy this, paste it, and we'll type in source ~/.bashrc to refresh our terminal.

    我把它複製粘貼進去,然後輸入 source ~/.bashrc 刷新終端。
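
The lines it asks for are the standard pyenv shell setup; the installer prints the exact ones for your shell, but for bash they look roughly like this:

```
# Append to ~/.bashrc so pyenv loads in every new shell
export PYENV_ROOT="$HOME/.pyenv"
[[ -d $PYENV_ROOT/bin ]] && export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"

# Then reload your shell configuration
source ~/.bashrc
```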

  • And let's see if PyENV works.

    讓我們看看 PyENV 是否能正常工作。

  • pyenv, we'll do a -h to see if it's up and running.

    執行 pyenv -h,看看它是否正常運行。

  • Perfect.

    太完美了

  • Now let's make sure we have a version of Python installed that will work for most of our stuff.

    現在,讓我們確保已安裝的 Python 版本能滿足大部分需求。

  • We'll do pyenv install 3.10.

    我們執行 pyenv install 3.10。

  • This will, of course, install the latest patch release of Python 3.10.

    這當然會安裝 Python 3.10 的最新修訂版本。

  • Excellent, Python 3.10 is installed.

    太好了,Python 3.10 已安裝。

  • We'll make it our global Python by typing in pyenv global 3.10.

    輸入 pyenv global 3.10,把它設為全局 Python。
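
Spelled out:

```
# Build and install the newest 3.10.x release, then make it the default
pyenv install 3.10
pyenv global 3.10
python --version    # should now report Python 3.10.x
```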

  • Perfect.

    太完美了

  • And now we're gonna install AUTOMATIC1111.

    現在,我們要安裝 AUTOMATIC1111。

  • The first thing we'll do is make a new directory, mkdir for make directory.

    我們要做的第一件事就是新建一個目錄,mkdir 表示 make directory。

  • We'll call it stable diff.

    我們就把它叫做 stable diff。

  • Then we'll jump in there, cd stable diff.

    然後進入這個目錄,cd stable diff。

  • And then we'll use this wget command to wget this bash script.

    然後使用 wget 命令獲取這個 bash 腳本。

  • We'll type in ls to make sure it's there.

    我們將輸入 ls 以確保它在那裡。

  • There it is.

    就在那兒。

  • Let's go ahead and make that sucker executable by typing in chmod.

    讓我們繼續輸入 chmod 命令,讓這個小東西成為可執行文件。

  • We'll do a +x and then webui.sh.

    加上 +x,然後是 webui.sh。

  • Now it's executable.

    現在可以執行了。

  • Now we can run it.

    現在我們可以運行它了。

  • Dot, forward slash, webui.sh — ./webui.sh.

    點、斜線、webui.sh,也就是 ./webui.sh。
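
The whole sequence for copy-paste; the directory name here is just illustrative, and the script comes from the AUTOMATIC1111 stable-diffusion-webui repository, so double-check the URL against that repo if it has moved:

```
# Make a working directory and grab the AUTOMATIC1111 launch script
mkdir stablediff && cd stablediff
wget https://raw.githubusercontent.com/AUTOMATIC1111/stable-diffusion-webui/master/webui.sh

# Make it executable and run it; the first run clones the repo, installs
# PyTorch, and downloads a Stable Diffusion model, so it takes a while
chmod +x webui.sh
./webui.sh
```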

  • Ready, set, go.

    準備,準備,開始

  • This is gonna do a lot of stuff.

    它能做很多事

  • It's gonna install everything you need for the Stable Diffusion web UI.

    它會安裝 Stable Diffusion web UI 所需的一切。

  • It's gonna install PyTorch and download Stable Diffusion.

    它會安裝 PyTorch,並下載 Stable Diffusion。

  • It's awesome.

    太棒了

  • Again, a little coffee break.

    再來,喝杯咖啡休息一下。

  • Okay, that took a minute.

    好的,花了一分鐘。

  • A long time.

    很長時間

  • I hope you got plenty of coffee.

    我希望你有足夠的咖啡。

  • Now it might not seem like it's ready, but it actually is running.

    現在它可能看起來還沒準備好,但實際上已經在運行了。

  • And you'll see the URL pop up like around here.

    然後你就會看到 URL 彈出,就像在這裡一樣。

  • It's kind of messed up.

    這有點亂。

  • But it's running on port 7860.

    但它運行的端口是 7860。

  • Let's try it out.

    讓我們試一試。

  • And this is gonna, this is fun.

    這會很有趣

  • Oh my gosh.

    我的天啊

  • So localhost 7860.

    所以是 localhost 7860。

  • What you're seeing here is hard to explain.

    你在這裡看到的情況很難解釋。

  • Let me just show you.

    讓我演示給你看

  • And let's generate.

    讓我們生成。

  • Okay, it got confused.

    好吧,它搞混了。

  • Let me take away the oompa loompas part.

    讓我把 Oompa loompas 部分去掉。

  • But this isn't being sped up.

    但這並沒有加快速度。

  • This is how fast this is.

    就是這麼快。

  • No, that's a little terrible.

    不,這有點糟糕。

  • What do you say we make it look a little bit better?

    我們把它弄得好看一點,怎麼樣?

  • Okay, that's terrifying.

    好吧,這太可怕了。

  • But just one of the many things you can do with your own AI.

    但這只是您可以用自己的人工智能做的眾多事情之一。

  • Now you can actually download other models.

    現在,您可以下載其他模型了。

  • Let me show you what it looks like on Terry and my new editor, Mike, telling me to do this.

    讓我給你們看看在 Terry 上是什麼效果,這是我的新剪輯師 Mike 叫我做的。

  • That's weird.

    真奇怪

  • Let's make it take more time.

    讓我們多花點時間吧。

  • But look how fast this is.

    但你看這速度多快。

  • Like it's happening in real time as I'm talking to you right now.

    就像我現在跟你說話時正在發生的一樣。

  • But if you've ever made images with GPT-4, it just takes forever.

    但如果你用 GPT-4 製作過影像,就會發現它需要很長時間。

  • But I just love the fact that this is running on my own hardware.

    但我就是喜歡它能在我自己的硬件上運行。

  • And it's kind of powerful.

    它很強大。

  • Let me know in the comments below which is your favorite image.

    請在下方評論中告訴我你最喜歡哪張圖片。

  • Actually post on Twitter and tag me.

    其實是在 Twitter 上發佈並標記我。

  • This is awesome.

    這太棒了。

  • Now this won't be a deep dive on Stable Diffusion.

    這裡不會深入探討 Stable Diffusion。

  • I barely know what I'm doing.

    我幾乎不知道自己在做什麼。

  • But let me show you real quick how you can easily integrate AUTOMATIC1111.

    不過,讓我快速演示一下如何輕鬆地整合 AUTOMATIC1111。

  • Did I do enough ones?

    我做的夠多嗎?

  • I'm not sure.

    我不確定。

  • And its Stable Diffusion, right inside OpenWebUI.

    以及它的 Stable Diffusion,直接整合在 OpenWebUI 裡。

  • So it's just right here.

    所以它就在這裡。

  • Back at OpenWebUI, if we go down to our little settings here and go to settings, you'll see an option for images.

    回到 OpenWebUI,如果我們點擊這裡的小設置,進入設置,你會看到一個影像選項。

  • Here we can put our AUTOMATIC1111 base URL, which will simply be http://127.0.0.1:7860, which is the same as saying localhost, port 7860.

    在這裡填上 AUTOMATIC1111 的 base URL,也就是 http://127.0.0.1:7860,等同於 localhost 的 7860 端口。

  • 7860, I think that's what it is.

    7860,我想就是這個。

  • We'll hit the refresh option over here to make sure it works.

    我們點擊這裡的刷新選項,以確保它能正常工作。

  • And actually, no, it didn't.

    事實上,並沒有。

  • And here's why.

    原因就在這裡。

  • There's one more thing you gotta know.

    還有一件事你必須知道。

  • Here we have the Stable Diffusion web UI running in our terminal.

    這裡,Stable Diffusion 的 web UI 正在我們的終端裡運行。

  • Then I Ctrl+C, it's gonna stop it from running.

    然後我按 Ctrl+C,讓它停止運行。

  • In order to make it work with OpenWebUI, we gotta use two switches to make it work.

    為了讓它與 OpenWebUI 配合使用,我們必須使用兩個開關。

  • So let's go ahead and run our script one more time.

    讓我們再運行一次腳本。

  • The web UI script, webui.sh.

    也就是 webui.sh 這個腳本。

  • And we'll do --listen and --api.

    然後加上 --listen 和 --api 兩個參數。
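
So the restart looks like this; --api exposes the endpoints Open WebUI talks to, and --listen makes the UI reachable from other machines on your network:

```
# Relaunch the Stable Diffusion web UI with the API enabled
./webui.sh --listen --api
```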

  • Once we see the URL come up, okay, cool, it's running.

    一旦我們看到 URL 出現,好的,很酷,它正在運行。

  • We can go back over here and say, why don't you try that again, buddy?

    我們可以回到這裡說,你為什麼不再試一次呢,夥計?

  • Perfect.

    太完美了

  • And then over here we have image generation experimental.

    然後,我們在這裡進行影像生成實驗。

  • They're still trying it out.

    他們還在試用。

  • We'll say on and we'll say save.

    我們把它打開,然後點保存。

  • So now if we go to any prompt, let's do a new chat.

    所以現在隨便進一個對話框,我們開一個新的聊天。

  • We'll chat with Llama 2.

    我們和 Llama 2 聊天。

  • I'll say describe a man in a dog suit.

    我會描述一個穿著狗衣服的人。

  • This is for a stable diffusion prompt.

    這是穩定的擴散提示。

  • A bit wordy for my taste, but then notice we have a new icon.

    我覺得有點囉嗦,但注意到我們有了一個新圖標。

  • This is so neat.

    真漂亮

  • Boom, an image icon.

    嘭,一個影像圖標。

  • And all we have to do is click on that to generate an image based on that prompt.

    我們要做的就是點擊它,根據提示生成影像。

  • I clicked on it, it's doing it.

    我點擊了它,它正在做。

  • And there it is, right in line.

    它就在那裡,排成一行。

  • That is so cool.

    太酷了

  • And that's really terrifying.

    這真的很可怕。

  • I love this, it's so fun.

    我喜歡這個,太有趣了。

  • Now this video is getting way too long, but there are still two more things I wanna show you.

    現在這段視頻已經太長了,但我還想給你們看兩樣東西。

  • I'm gonna do that really quickly right now.

    我現在就去做。

  • The first one is, it's just magic.

    第一個是,它就是魔法。

  • Check it out.

    看看吧

  • There's another option here inside OpenWebUI.

    在 OpenWebUI 中還有另一個選項。

  • A little section right here called documents.

    這裡有一小部分叫做文件。

  • Here we can simply just add a document.

    在這裡,我們只需添加一個文件即可。

  • I'll add that one from before.

    我再加上之前的那個。

  • It's there, available for us.

    它就在那裡,可供我們使用。

  • And now when we have a new chat,

    現在我們有了新的聊天工具、

  • I'll chat with CodeGemma.

    我去和 CodeGemma 聊聊。

  • All I have to do is do a hashtag and say, let's talk about this.

    我只需要輸入一個 # 號,然後說,我們來談談這個。

  • And say, give me five bullet points about this.

    然後說,給我五個關於這個問題的要點。

  • Cool.

    酷斃了

  • Give me three social media posts.

    給我三個社交媒體帖子。

  • Okay, CodeGemma, let me try it again.

    好吧,CodeGemma,讓我再試一次。

  • What just happened?

    剛才發生了什麼?

  • Yeah, let's do a new prompt.

    是啊,我們來做一個新的提示。

  • Oh, there we go.

    哦,又來了。

  • And I'm just scratching the surface.

    而我只是淺嘗輒止。

  • Now, second thing I wanna show you.

    現在,我想給你們看第二件事

  • Last thing.

    最後一件事

  • I am a huge Obsidian nerd.

    我是個超級 Obsidian 迷。

  • It's my notes application.

    這是我的筆記應用程序。

  • It's what I use for everything.

    我什麼都用它。

  • It's been very recent.

    這是最近的事。

  • I haven't made a video about it, but I plan to.

    我還沒有製作過相關視頻,但我打算這樣做。

  • But one of the cool things about this, this very local private notes taking application, is that you can add your own local GPT to it, like what we just deployed.

    不過,這個非常本地化的私人筆記應用程序的一個亮點是,你可以在其中添加自己的在地 GPT,就像我們剛剛部署的那樣。

  • Check this out.

    看看這個

  • I'm gonna go to settings.

    我要去設置。

  • I'll go to community plugins.

    我去看社區插件。

  • I'll browse for one.

    我會瀏覽一個。

  • I'm gonna search for one called BMO, BMO chatbot.

    我要搜索一個叫 BMO 的插件,BMO Chatbot。

  • I'm gonna install that, enable it.

    我要安裝它,啟用它。

  • And then I'm gonna go to settings of BMO chatbot.

    然後進入 BMO Chatbot 的設置。

  • And right here, I can have an Ollama connection, which is gonna connect to, let's say Terry.

    在這裡,我可以建立一個 Ollama 連接,讓它連到,比如說,Terry。

  • So I'll connect him to Terry.

    所以我要把他和特瑞聯繫起來。

  • And I'll choose my model.

    我會選擇我的模型。

  • I'll use Llama 2, why not?

    我就用 Llama 2,為什麼不呢?

  • And now, right here in my note,

    而現在,就在我的筆記裡、

  • I can have a chatbot come right over here to the side and say like, hey, how's it going?

    我可以讓一個聊天機器人出現在這邊,然後說,嘿,最近怎麼樣?

  • And I can do things like look at the help file, see what I can use here.

    我可以做一些事情,比如查看幫助文件,看看我能在這裡用到什麼。

  • Ooh, turn on reference.

    哦,打開參考資料。

  • So I'm gonna say reference on.

    所以我要說,繼續參考。

  • It's now gonna reference the current note I'm in.

    它現在會引用我當前所在的這篇筆記。

  • Tell me about the system prompt.

    說說系統提示。

  • Yep, there it is.

    沒錯,就是它。

  • And it's actually going through and telling me about the note I'm in.

    它真的在讀我所在的這篇筆記,並告訴我裡面的內容。

  • So I have a chatbot right there, always available for me to ask questions about what I'm doing.

    所以,我的聊天機器人就在旁邊,隨時可以問它關於我手頭工作的問題。

  • I can even go in here and go highlight this, do a little prompts like generate.

    我甚至可以在這裡突出顯示這一點,做一些提示,比如生成。

  • It's generating right now.

    它正在生成。

  • And it's generating some stuff for me.

    它為我帶來了一些東西。

  • I'm gonna undo that.

    我要把它撤掉

  • Let me do another note.

    讓我再做一個說明。

  • So I wanna tell a story about a man in a dog suit.

    所以我想講一個穿著狗衣服的人的故事。

  • I'll quickly talk to my chatbot and start to do some stuff.

    我可以很快地跟我的聊天機器人對話,開始做一些事情。

  • I mean, that's pretty crazy.

    我是說,這太瘋狂了。

  • And this I think for me is just scratching the surface of running local AI private in your home on your own hardware.

    對我來說,這還只是在家裡用自己的硬件運行在地人工智能的皮毛。

  • This is seriously so powerful and I can't wait to do more stuff with this.

    這真的太強大了,我迫不及待地想用它做更多的事情。

  • Now I would love to hear what you've done with your own projects.

    現在,我很想聽聽你們在自己的項目中都做了些什麼。

  • If you attempted this, if you have this running in your lab, let me know in the comments below.

    如果你嘗試過,如果你在實驗室裡運行過,請在下面的評論中告訴我。

  • Also, do you know of any other cool projects I can try that I can make a video about?

    另外,你們還知道哪些我可以嘗試、可以拍成視頻的酷項目嗎?

  • I will love to hear that.

    我很樂意聽到這個消息。

  • I think AI is just the coolest thing.

    我認為人工智能是最酷的東西。

  • But also privacy is a big concern for me.

    但隱私也是我非常關心的問題。

  • So to be able to run AI locally and play with it this way is just the best thing ever.

    所以,能夠在本地運行人工智能並這樣把玩它,真是再好不過了。

  • Anyways, that's all I got.

    總之,我就知道這麼多了。

  • If you wanna continue the conversation and talk more about this, please check out our Discord community.

    如果您想繼續討論,請訪問我們的 Discord 社區。

  • The best way to join that is to jump through our Network Chuck Academy membership, the free one.

    加入的最好方式是通過我們的 NetworkChuck Academy 會員,免費的那種。

  • And if you do wanna join the paid version, we do have some extra stuff for you there too.

    如果你想加入付費版,我們也會為你提供一些額外的服務。

  • And it'll help support what we do here.

    這也會支持我們在這裡做的事情。

  • But I'd love to hang out with you and talk more.

    不過,我很想和你多聊聊。

  • That's all I got.

    我就知道這麼多。

  • I'll catch you guys next time.

    下次再找你們。
