
  • get a good get getting by.

  • Judging by the sound of the applause, it sounds very good.

  • Did you hear that?

  • I know it was the guy backstage.

  • Do it again.

  • Do it again.

  • I just wanted applause.

  • Huh?

  • You will.

  • Five of you are.

  • Well, what's the matter?

  • The last talk was good, right?

  • You're feeling energized.

  • You feeling inspired?

  • Amazing.

  • My name is Tejas.

  • That's pronounced like contagious.

  • But don't worry, I'm not contagious.

  • I'm not going to give you anything.

  • Besides, hopefully a good talk.

  • Um, and I have to say, I love this conference.

  • I was here for CSSConf and JSConf yesterday. Learned a ton.

  • I learned a ton about debugging, performance.

  • It's all here.

  • Can we hear it for the organizers of JSConf Budapest?

  • Absolutely incredible.

  • And also the city.

  • The city is really beautiful.

  • If you live here, wow, I'm jealous.

  • I'm jealous.

  • Um, so as I said, my name is Tejas.

  • Contagious is kind of a way to remember it, because it's a bit of a different Indian name.

  • At this point, you can just call me that joke guy from yesterday.

  • Ah, yesterday at the party. And so, yes, yes, I'm hilarious.

  • Uh, I work in Germany at a company called Contiamo.

  • And, you know, in the marketing way, we like to promote ourselves:

  • We accelerate data access.

  • That just means we help people understand their data and do all kinds of cool data science things. Um, but really, for me, what I love the most about my work is these people.

  • Oh, my God.

  • This is my team.

  • And they're some of the brightest, funniest, smartest.

  • Oh, my God.

  • I love them very much.

  • I'm very happy to work with them.

  • And, you know, I get to come here and tell you about how wonderful they are, but that, unfortunately, is not what this talk is.

  • about. This is instead a talk about serverless.

  • How many of you have heard of serverless?

  • Um wow, everybody.

  • So nothing new will be learned here.

  • I'm just kidding.

  • Um, we're gonna talk about, so I'm gonna look at some adoption, some use cases. Uh, we'll look at what it means and why

  • I think it is absolutely revolutionary, and I mean that. The talk is called Legendary Lambdas for a reason.

  • Ah, but before we get into that, let's kind of look at some facts and figures.

  • These are from surveys; the citations are on the bottom left. I'd encourage you to go look them up.

  • Um, and according to The New Stack, some time ago, about 78% of participants in the survey wanted to adopt serverless into their ecosystem, into their internal service architecture.

  • 78%! Which means by now they may have already. Another interesting statistic is that 75% of the problem space has been penetrated by serverless in the last 18 months or so, and that is really, really, really exciting.

  • It very quickly overtook, um, the prior art, if you will: containers as a service. And it's growing very fast.

  • Um, and these are just numbers.

  • But there are real-world use cases where serverless is doing amazing things, and one such use case is by one of the biggest brands in the whole world.

  • Coca-Cola reduced their operating costs by about 65% on serverless, and how they did that is they had these vending machines that were like 10 to 12 years old, and they would have to send some telemetry.

  • They have to send information about, like, are there enough drinks and stuff, to Coke.

  • Now these, up until around 2016, ran on Amazon's EC2 and cost about $12,864 a year.

  • Okay, um, after moving to serverless, Coca-Cola pays roughly $4,490, at the time of the case study.

  • That's a 65% reduction.

  • And they could serve 30 million requests at one time, at the time of publishing of this study. Very, very, very interesting.

  • Second, if you've ever played or heard of Final Fantasy: Square Enix is the company behind some of the world's greatest MMORPGs, massively multiplayer online role-playing games, and they can have millions of players on at the same time.

  • And what players like to do at massive scale is take screenshots, so they have an image processing function that would process these screenshots.

  • Now, um, processing these screenshots, especially under heavy load, used to take several hours. After moving to serverless, these take a little over 10 seconds. And more than that,

  • they can comfortably do this with traffic spikes of over two times.

  • So if there's a gaming tournament and there's like millions of people online, this function to process the images eats it for breakfast.

  • Um, lastly, if you've ever heard of or used a WiFi router in your entire life, there's a good chance it was a Thomson one.

  • And Thomson, one of the world's biggest brands for routers, is able to process 4,000 events per second on serverless.

  • Um, what's really impressive about this is, even, you know, under very heavy conditions, there's minimal, probably no data loss.

  • Um, but for me, and this is what I want to focus on today, is they scheduled five months to get this into production, according to the report.

  • And, you know, if you're at the scale of Thomson, five months is a really short time.

  • I've heard people say that in larger companies things just take longer.

  • There's more meetings, more planning, whatever.

  • So five months is ambitious.

  • Thomson actually ended up moving this thing to serverless two months ahead of schedule.

  • Two months ahead. With serverless, you're able to get up and running really, really fast.

  • And so I tell you all these things to tell you it's here and it's big and people are talking about it.

  • But don't take my word for it.

  • I think one of my friends, Nader, said it best. He just says this: serverless is the future.

  • And so if it is the future and we're all kind of going there anyway, let's spend 30 minutes talking about it in a little bit of detail.

  • Uh, what is it?

  • What is serverless? "Serverless," in quotes.

  • Um, it's a loaded term, right?

  • It could mean functions, which we'll talk about, or it could just mean, like, static websites, JAMstack things.

  • Um, I want to make a point here.

  • That may be controversial, but you know what?

  • It's just it's you love me, you know, it's fine.

  • Whatever.

  • Um, I think just the word serverless is kind of a lie. Like, in English,

  • that's called a misnomer. Mis, as in not quite; nomer, as in name. Now, in software engineering, naming things is hard, and I think we may have misnamed serverless, because I was talking to my wife about it.

  • She's, you know, she studies law and isn't very involved in tech, and it sounds like, well, there's no servers at all, which is a huge lie.

  • And so I'd invite you to consider this.

  • It's a wireless charger.

  • Also kind of a lie.

  • Um, and so serverless is kind of like that.

  • It doesn't mean there are no servers. It means that they're not your problem.

  • You do what you love.

  • You get to focus on building the apps, the experiences you absolutely love, and you pay Amazon or ZEIT or Netlify or one of these some amount of money, or use their free tier, and your stuff's magically in the cloud.

  • Serverless. You can quote me on this: serverless brings the cloud down.

  • Um, but why?

  • And there's a number of reasons why serverless is so popular, and we'll go through them.

  • I think you know them.

  • They're not like rocket science or mind-blowing.

  • But as I just said, you can focus on the things you love.

  • You delegate the responsibility of servers to someone else.

  • So you write your JavaScript, you write your Python,

  • if you're into that, and you put it in the cloud, and the concerns of provisioning the server, setting up the runtime, getting it up, all of that, you don't deal with.

  • If for some reason your server dies and you need to send traffic to a different machine, for whatever reason, that's handled for you. That alone is like, I can breathe a little bit easier.

  • Number three is scale. Like if, for example, you're Square Enix and there's a huge gaming tournament, you don't have to think about scaling vertically by adding more memory or, you know, throwing hardware at the problem, or scaling horizontally by adding more servers to your cluster. It just doesn't matter.

  • You create the software you know and love, and these are kind of the three tenets of serverless.

  • But I invite you to consider one more, and that is this one.

  • You see, most tech conferences, I think it may be fair to say all tech conferences, are on this spectrum of technology and community.

  • Um, and JSConf family conferences that I've been to usually lean a little bit more towards community.

  • And if we're into community, we need to talk accessibility. In case that's not big enough: accessibility.

  • I think serverless makes the web, the cloud, so accessible, um, that everybody can play, and this is huge.

  • You see, I come from I was born in a country that is poor, it's a developing country.

  • India.

  • In case you were wondering, I love the food.

  • It's amazing, but the country is, it's growing, right? And the prior art of serverless means you have to have some type of server.

  • That means you have to either get shared hosting if money's tight, or you get a virtual private server or a bare-metal server.

  • Now with these, there's money involved.

  • And sometimes for some communities, this money is too much.

  • It's not as accessible. With Lambda, serverless helps these communities. See,

  • India, I think, is home to some of the most brilliant minds on the face of the planet.

  • And if you need evidence of that, you don't have to look very far.

  • The CEO of Google, the CEO of Microsoft, Sunil Pai at Facebook, there's React. India is happening right now, and there's a whole bunch of them ready to learn and grow.

  • The problem is, the cloud is not accessible because you pay so much for a server.

  • It's unbelievable.

  • Lambda has this pricing model that allows people from these communities to put stuff in the cloud.

  • You have an idea, put it in there. And how that's possible is because lambdas are functions, they're just functions.

  • There was an excellent talk yesterday on functional programming. They're functions, and as we saw, functions, what do they do?

  • You write them, they start, they do a job.

  • They run to completion, they finish, they return a value.

  • That's it.

  • In a perfect world, your functions, they're stateless.

  • They don't close over their own state.

  • They have no side effects, and they're pure, meaning for any given input, they give you the same output, without side effects.

  • They're predictable.

  • If they're predictable, they're testable.

  • You can unit test them and make sure they behave the way you want them to.

  • If they're testable, they're scalable, meaning you can run 50 different versions of the same function.

  • They do the same thing, and if they're infinitely scalable, you can go really, really, really far.
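
A minimal sketch of that chain from pure to testable (the function name and numbers here are made up for illustration, not from the talk):

```js
// A pure, stateless function: for any given input it returns the same output,
// and it touches nothing outside itself.
function addTax(amount, rate) {
  return amount + amount * rate;
}

// Because it is predictable, a plain assertion is enough to unit test it,
// and any number of copies of it can run in parallel without coordination.
const assert = require("assert");
assert.strictEqual(addTax(100, 0.25), 125);
console.log("addTax behaves the same on every invocation");
```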

  • And so, since functions are invoked, it presents an entirely different pricing model. Alright?

  • Previously, you would pay monthly for a server. People use it, people don't, I don't care; I'm losing 350 euros a month paying for a server.

  • I'm wildly unpopular.

  • I'm paying for a server.

  • I can't pay my rent, but maybe someone will use it and it'll catch on.

  • I'll just keep hoping.

  • I pay for this, I lose money, a whole year. With Lambda, you pay for each time your function is invoked.

  • And I think that's a game changer.

  • I really do, because if no one uses your thing, you pay nothing. And with some providers, if one person uses it, you pay, I think, like a fifteen-millionth of a dollar.

  • You pay for invocations and you pay for compute time.

  • We'll talk more about that.

  • It's almost free. And since it's almost free, no matter how much money you or your country has, you can put stuff in the cloud.

  • That's incredible.

  • And so I thought I could show you some examples if the Internet cooperates.

  • If it doesn't, you know, uh, we'll see what happens.

  • And so here's an example.

  • Here's a serverless function in JavaScript, because it's a JS conference.

  • So I have a request, response set of arguments, and I'm sending through the response a heading of the current date and time.
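
A minimal sketch of what that function might look like, assuming a Node-style (request, response) handler of the kind providers like ZEIT/Vercel expose; the exact code on the slide isn't shown in the transcript:

```js
// api/now.js — a serverless function with a (request, response) pair of arguments.
// On every invocation it sends back a heading with the current date and time.
module.exports = (request, response) => {
  response.setHeader("Content-Type", "text/html");
  response.end(`<h1>${new Date().toString()}</h1>`);
};
```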

  • Uh, let's put this in the cloud.

  • I want you to pay attention, because this is super sophisticated, to put stuff in the cloud on serverless.

  • I'm using a provider, there's others, but I like this one, and I just enter one command, literally one command, and it says, okay, I'm deploying this function.

  • Bam.

  • It's done, uh, I open it.

  • And so what happens is that function's called when I open the URL, and it returns a value to the browser.

  • This calls the function that calls the function.

  • If I reload the page, it calls the function every time, you see, and so I'm kind of running up a huge bill here by reloading this page.

  • But it's not very much.

  • I'm sure I can afford it.

  • I'm actually on their free tier, so I'm sure I could afford it.

  • Um, but another case for serverless, if you wanna get creative, is this thing, which will automatically give you a randomly generated thinking emoji on every invocation.

  • This one apparently is huge.

  • But every time you reload the page, you just get a new one, because the function is called on every reload.

  • I could do this all day.

  • Okay, moving on.

  • Um, So what we saw was cool.

  • I don't know why my dock is showing. Like, Apple, if you're watching, please, quality.

  • The pricing for that, though, is this much: you pay for each time it's invoked, and you pay for the amount of gigabyte-seconds you use. And this is on Amazon Web Services.

  • That's that's crazy.

  • But I spend a lot of time at a lot of tech conferences, talking to a lot of people about things: about serverless, about CSS, about life and love and Romeo and Juliet, whatever.

  • Um, and as with everything, there are concerns.

  • Can it be that good?

  • Is it too good to be true?

  • Uh, the two big concerns I hear often are... acceptance of service. A traditional problem with servers is denial of service.

  • You get hit with too many requests and then your server's like, I can't deal with it.

  • I quit.

  • And it denies service.

  • With serverless, since it scales up, the other side of the problem is true: if you pay for each time your URL is accessed, someone could just, like, while (true) and then fetch you every time.

  • You know what I mean?

  • And then they just send request after request, and then you get a bill of $2 million.

  • Um, how do you deal with that? Number two:

  • if you say, Tejas, our lambdas need to be stateless and pure, but I need state.

  • I need a database.

  • How do I do this?

  • We'll talk about those.

  • Let's start with acceptance of service.

  • Um, let's go down a hypothetical journey.

  • Okay.

  • Let's say your worst enemy sets up a network of 100,000 bots.

  • Hey, um, and they just, like, keep hitting your server for 60 minutes.

  • One request per second.

  • Okay, that is a total of 360 million Lambda invocations in an hour.

  • Big numbers, big and scary.

  • It's even physically big on this screen.

  • Um, so that's a lot.

  • Let's add to the equation.

  • Let's say your function takes 200 milliseconds of compute time and uses a gigabyte of memory.

  • These are just factors for a little experiment. Now,

  • on Amazon Web Services with Lambda, this is what you would pay.

  • It's a two-digit number for invocations: $72.

  • And since it's one gigabyte of memory for 200 milliseconds each time, it's $1,024.

  • All of that to say, in an attack at that scale, you pay less than $1,500.

  • You would pay that anyway on, like, a previous server-based solution, if not more.
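
A rough back-of-the-envelope of that arithmetic, assuming AWS Lambda's published rates at the time (about $0.20 per million requests and roughly $0.0000166667 per GB-second); the slide's exact compute figure differs slightly from this estimate:

```js
// Hypothetical attack: 100,000 bots, 1 request per second each, for 60 minutes.
const invocations = 100000 * 60 * 60;            // 360,000,000 invocations
const perMillionRequests = 0.20;                  // USD per 1M invocations
const perGbSecond = 0.0000166667;                 // USD per GB-second

const invocationCost = (invocations / 1000000) * perMillionRequests; // ≈ $72
const gbSeconds = invocations * 0.2 * 1;          // 200 ms at 1 GB per call
const computeCost = gbSeconds * perGbSecond;      // ≈ $1,200

console.log({ invocationCost, computeCost, total: invocationCost + computeCost });
```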

  • This is an extreme case, and I think if you're an early stage startup or late stage startup or whatever, that's within reach.

  • And if you're attacked at that scale, you're probably famous enough to afford it.

  • Maybe. And there's, on their service, something called Amazon CloudWatch that will kind of watch your thing and warn you if you're getting too much traffic, so there are measures you can implement to prevent this.

  • I think that's pretty okay.

  • Second problem.

  • The need for state.

  • I need state.

  • We all need state.

  • You kind of can't have an app without state.

  • How do we do this?

  • Every time I talk about serverless with someone, I don't know why, but the thinking goes to, like, okay,

  • Everything needs to be a function.

  • Everything.

  • Everything's pure, everything.

  • Like, we love this in JavaScript.

  • Like, someone says something about Gatsby, we're like, we really need to move everything to Gatsby.

  • Or someone says something about React Hooks, we're like, hooks everywhere!

  • Classes are evil, you know?

  • Like everything doesn't need to be a function.

  • Okay, um, use it in moderation.

  • Use it with consideration.

  • But perhaps you have a MongoDB Atlas or Postgres database, a GraphQL API somewhere. Great. Maybe there's a place for your function at the API layer.

  • Maybe there isn't.

  • But not everything needs to be a function.

  • Yes, you can have your state.

  • You can even eat it if you want.

  • Um, if you didn't understand that, it's okay.

  • Uh, not everything needs to be a function.
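
As a rough sketch of what "a function at the API layer" could look like, with state kept in an external database rather than in the lambda itself; the table, query, and `pg` connection string are hypothetical, not from the talk:

```js
// api/posts.js — the lambda stays stateless; all state lives in Postgres.
const { Pool } = require("pg");

// Created once at module scope so it can be reused across invocations.
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

module.exports = async (request, response) => {
  // Hypothetical table: the function is just a thin API layer over the data.
  const { rows } = await pool.query("SELECT id, title FROM posts LIMIT 10");
  response.setHeader("Content-Type", "application/json");
  response.end(JSON.stringify(rows));
};
```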

  • And so that brings me to the study part of the talk, which is to discuss briefly a study that I did, literally, for science.

  • I have to preface this multiple times by saying: don't do this.

  • It's one of those things that you do because you're curious.

  • Don't do it in production.

  • Um, there's better ways to do this.

  • JAMstack is a good way.

  • Other ways too.

  • But anyway, I wanted to try out serverless, but, like, with UI. Um, so I took some frameworks and did some stuff with lambdas, and we'll see some results.

  • But how do you do, like, a UI? So you have a UI and you want to put it in a lambda.

  • Uh, is your UI a function? Or is it, like, divs and stuff?

  • Actually, no.

  • Wait a second.

  • Your UI is a function of your state. Like, this is like the E = mc² of UI development, isn't it? Like, UI is a function of state.

  • Okay, I don't see that.

  • Wait a second.

  • That means UI could just be a lambda.

  • Whoa!

  • Okay, cool.

  • Let's try it.

  • Let's write, like, ReactDOM.renderToString on a lambda.

  • And that's exactly what we did.

  • We, um, server-rendered the UI, um, as a serverless lambda.
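
A minimal sketch of that idea, UI = f(state) rendered inside a lambda using React's renderToString; the component and data here are placeholders, not the study's actual Reddit-clone code:

```js
// api/page.js — server-rendering a React component from a serverless function.
const React = require("react");
const { renderToString } = require("react-dom/server");

// UI is a function of state: given the same props, it renders the same markup.
const Page = ({ posts }) =>
  React.createElement(
    "ul",
    null,
    posts.map((p) => React.createElement("li", { key: p.id }, p.title))
  );

module.exports = (request, response) => {
  const state = { posts: [{ id: 1, title: "Hello from a lambda" }] };
  const html = renderToString(React.createElement(Page, state));
  response.setHeader("Content-Type", "text/html");
  response.end(`<!doctype html>${html}`);
};
```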

  • So I guess that's serverless server-side rendering. Is that SSSR? Uh, no, let's not make that a thing. Mmm.

  • Seriously, to the study. Anyway, for the study we, um, we looked at some leading UI frameworks, so we looked at Preact, React, lit-html, that's the Polymer one,

  • Please don't be offended.

  • um, Vue, and vhtml. So, five frameworks. And what we wanted to do was we wanted to build a Reddit clone, and this is just a clone of Reddit.

  • It uses a flat file as the back end.

  • So we don't really talk to the Reddit API.

  • And that's to reduce latency and the standard deviation on the benchmark. And the reason we chose server-side rendering:

  • there was an excellent talk yesterday, actually, by Surma and Jake, where they talked a little bit about server-side rendering.

  • It's just better than if you're not server rendering.

  • I think it can be better in many cases, partly because everything's delivered to your users in one network hop. Like, they visit your page, and they get exactly what they want.

  • There is no, like, oh, my JavaScript bundle is loading now. Like, some of you have seen these experiences on the web, where you get a shell of an app and then spinners immediately. Um, server-side rendering gives them what they need.

  • Um, it's also better for you in terms of search engine marketing. I've heard, I think at Google I/O actually, that it's significantly harder for search engines to index these things.

  • As well, users don't like spinners. But one benefit you get, actually, with serverless and server-side rendering, again,

  • Probably not a good idea.

  • um, is you get analytics out of the box. So imagine you have a blog and each article's deployed as a lambda.

  • So if from article A your serverless bill is $2, and from article B your serverless bill is $200, you kind of know what kind of content to be producing from there.

  • I thought that was interesting.

  • Um, but but regardless, so this is what we wanted to build.

  • Um, and we wanted to track three key metrics.

  • Metric Number one is the cost of booting the function cold.

  • We'll talk a little bit more about this because it's a little bit exclusive to serverless. Number two is the lambda size, or the package size.

  • This is a concept you're aware of, and number three is the throughput.

  • How many requests per second can this handle?

  • But let's talk a little bit about cold boots. With lambdas, how they execute

  • is there are these ephemeral containers. Like, somebody accesses your URL, and then a container spins up, npm installs (JavaScript ecosystem, npm install) your dependencies, gets ready, then executes the function, importing your npm modules, and returns a value.

  • There's a bit of ceremony there, and that ceremony can take time.

  • That's called a cold boot.

  • This is best visualized with this graphic here.

  • So a request comes in and it downloads your code and boots, and then your code runs.

  • Once your code runs, the function is considered hot or warm.

  • It's ready to serve more requests, and it stays hot for a while.

  • Um, but after a while, they wind down and go cold again.

  • There's a typo in the text; I hope you don't see it.

  • So that's kind of how cold boots work.
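
The talk doesn't show code for this, but one practical consequence of the hot/warm window is that work done at module scope happens once per cold boot, while the handler body runs on every request. A minimal sketch, assuming a Node (request, response) handler:

```js
// Runs once per cold boot, when the container spins up and loads your code.
console.log("cold boot: loading dependencies");
const bootedAt = Date.now();

module.exports = (request, response) => {
  // Runs on every invocation; while the container stays warm,
  // `bootedAt` is reused rather than recomputed.
  response.end(`This container has been warm for ${Date.now() - bootedAt} ms`);
};
```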

  • So we wanted to test which framework required the most resources to start.

  • Um, let's talk briefly about the code involved.

  • The code for all of the Reddit examples, with React, Preact, and vhtml, I believe, was the same code, exactly the same code.

  • How do you do that?

  • Using this thing?

  • How many of you have heard of hyperscript? I can count on my fingers, like, five. Hyperscript is kind of just a function signature, and it's the function signature of something you may be more familiar with: React.createElement.

  • It takes the tag of the element, the props, and the children. And Preact and vhtml, they implement this interface as well.

  • And so what we ended up doing was, you just kind of assign h to one of these three, and the rest of the code stays exactly the same.
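
A minimal sketch of that trick: the component is written against the hyperscript signature h(tag, props, ...children), and h is assigned to React.createElement, Preact's h, or vhtml, while the component code stays identical. This is an illustration, not the study's exact source:

```js
// Pick one of the three; the component code below does not change.
const h = require("react").createElement;
// const { h } = require("preact");
// const h = require("vhtml");

// A Reddit-style list written purely against the hyperscript signature.
const PostList = ({ posts }) =>
  h(
    "ul",
    null,
    posts.map((post) => h("li", { key: post.id }, post.title))
  );
```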

  • It was really cool. For Vue and lit-html, we had to take a different strategy, because with Vue, you work with Vue instances.

  • Um, but we ended up copying the HTML string rendered and just interpolating with some Vue there.

  • It's important to note here, lit-html doesn't have, um, a Node implementation.

  • I don't believe so.

  • We used a fork.

  • And so the moment we've all been waiting for, the talk is almost over.

  • Now we're gonna look at the results of this briefly.

  • But again, as I said in the beginning, there's better ways. Do not have, like, a blog that's all lambdas.

  • So I'm not sure that's the best way to do it.

  • Um, but if we're talking about the fastest, the best framework on serverless, the answer is very much: no, don't do it.

  • Don't do it.

  • Really.

  • Just use the platform.

  • It's amazing.

  • Seriously, you could even use the JAMstack or, like, vanilla.

  • Seriously, like, just don't. Don't even. Anyway, uh, have I made my point?

  • Um, but the results regardless, in case you're curious about numbers, are: um, vhtml is really fast, and you just write JSX.

  • It looks just like react.

  • But what ends up happening is it doesn't have a virtual dom, and that's why it's fast.

  • What it does is it takes your, like, JSX and turns it into a string of HTML that you can then return from your lambda.

  • And because of this, it's super fast and super light, coming in at 28 kilobytes. Preact naturally was the second.

  • In fact, if you chose to do this, for, you know, for science, you could server-render the HTML, hydrate with Preact, and you have a really fast and pleasant experience.

  • Unfortunately, coming up at the rear was Vue, which I don't think is necessarily a bad thing.

  • The purpose of this study, kind of like the purpose of this talk, is to get us a little bit more serious about serverless, right? It's to get us to think about serverless a bit more. It's not to bash Vue, but it's to think about serverless.

  • And so I was actually quite... I feel like my work on this was vindicated, was successful, because the author of Vue noticed the study and responded with, maybe we'll revisit this when we optimize for serverless in a future version of Vue. And this is the goal,

  • I think, because, as Nader said so eloquently, serverless is the future. Because it's the future, I think we all should start thinking and optimizing for serverless in the near future.

  • With that, let us land this plane. It has been a beautiful journey.

  • I enjoy talking.

  • I could talk all day.

  • I'm not sure if you could listen all day, so we'll wrap it up.

  • We talked about how serverless is the future. We looked at how Coca-Cola, Square Enix, and others are really benefiting from serverless.

  • We looked at how community matters.

  • See, I feel like some of you may not get the weight, the potential, of serverless to be truly legendary, because for most of you, you do your job and git push, and your ops team puts it in the cloud.

  • Your ops team, though, may be losing sleep.

  • Your company may be spending more than necessary.

  • Um, maybe it doesn't directly impact you, but I truly believe serverless has a positive impact on your company and your team.

  • But by and large, the positive impact, as I mentioned, is in poor communities.

  • Financially, it brings the cloud down.

  • It makes it, it makes it available to the rest of us, and I truly believe, through these legendary lambdas, we're going to see the world change for the better.

  • We're going to see India develop, and other nations like it develop faster, and that leads to human satisfaction, happiness, joy, and flourishing.

  • And with that, I want to say thank you very much for your time and for listening.



Legendary Lambdas by Tejas Kumar | JSConf Budapest 2019
