
  • Good morning.

  • Uh, hello.

  • My name is Kyle Oba.

  • Uh, thank you to all the organizers and volunteers. You know, this has been an amazing experience so far.

  • Very excited to be here.

  • You can probably tell I'm a little bit nervous, so I'm just gonna dive right in.

  • All right. So I work at a company called Pot au Chocolat.

  • We work on local problems with local clients, and we try to find local solutions as much as possible. We're a research, design, and development company.

  • Uh, my partner and I started this company before moving to Hawaii, but she's actually from here.

  • And so, um, that's where this whole local-centered design and research focus comes from.

  • We partner with other researchers, designers and programmers, but enough about me.

  • Um, I'm sorry.

  • You can't see my slides.

  • Oh, no.

  • Um, should I just try to restart that?

  • Okay.

  • Oh, boy.

  • Okay, cool.

  • Thank you.

  • Thank you.

  • Uh, that's my presentation.

  • Uh, you know, this thing used to work.

  • Okay.

  • So cool.

  • Um, all right.

  • So diving right in.

  • So the unprecedented.

  • So Shoshana Zuboff recently published a book called The Age of Surveillance Capitalism.

  • I'm not gonna pretend to have read the whole thing because it's like over 700 pages, and I'm a busy person.

  • But, um, yeah, so that's her idea of the unprecedented, and in her own words, the unprecedented is necessarily unrecognizable.

  • So when we encounter something that's unprecedented, we automatically interpret it through the lenses that we're familiar with, or through familiar categories, thereby rendering invisible precisely that which is unprecedented.

  • When I read this, I was like,

  • Wow, this is amazing.

  • Um, I originally was going to center this talk all around context, but I really like her idea here.

  • Um And so if you were living or visiting Hawaii about a year ago, exactly almost, you would have received this on your phone.

  • It's all caps, so you know it's important.

  • Ballistic missile threat inbound to Hawaii.

  • Seek immediate shelter.

  • And then if you were like, is that real?

  • It says this is not a drill, which you know.

  • Usually people aren't drilling at eight in the morning on a Saturday.

  • But whatever.

  • Uh, yes.

  • So, from my own experience.

  • You know, I saw this.

  • You know, it was Saturday morning.

  • I was just kind of calmly doing stuff in the kitchen.

  • I look at my phone, and I guess I'm gonna die, right?

  • And, uh, you know, usually when I see this type of message on my phone, and maybe some of you got one last night, it's like, there's a flood coming.

  • Maybe you know, there's a hurricane on its way.

  • Maybe you should care.

  • Maybe right.

  • And so that's kind of the lens that I used to interpret this message, which, as Zuboff is saying, was totally the wrong lens to use.

  • It was just a lens that was familiar to me.

  • So I immediately started filling my water filter in the kitchen with water, which seems reasonable.

  • But, like, 38 minutes later, when they said, just kidding, that's not real,

  • I really, like, had time to evaluate, like, my own behavior, and it really didn't make any sense.

  • I mean, like, if we get hit by a missile, is pouring water gonna help? I mean, I don't know, it didn't really matter.

  • Okay, cool.

  • So that's the idea of the unprecedented how we require these lenses in order to interpret new things.

  • And so obviously this talk is a little bit about artificial intelligence, which is kind of a weird, broad term.

  • So, like, how is this relevant to us today?

  • So as builders and designers, as a lot of us are, we need to create that new lens for ourselves in order to interpret that which is unprecedented to us, to the community, or to society.

  • Um, yeah.

  • One of the common themes to a lot of the work that we do is that we like to try to find the unrelatable things, the invisible things, the undefined things, and try to make them more relatable and visible, and articulate them more fully.

  • And it's only through the creation of these new lenses that we can then further these discussions with our communities.

  • So, as an example of that, I wanted to take you all through a project that we did with the Honolulu Museum of Art, which is just down the street.

  • Uh, and I highly recommend it.

  • Um, so about the time that that alert came out, we had the opportunity to do this project with the museum, and we called it a design intervention, and we wanted to sort of forward the discussion of surveillance technologies.

  • So at the time, they had this program called Classified, that was produced by the Doris Duke Theatre, which is in the museum.

  • Highlights of the program were screenings of films, including Laura Poitras's Citizenfour, the documentary about Edward Snowden.

  • And there was also a panel discussion.

  • So the panel discussion took place on, you know, just one particular day, and involved Kate Crawford, one of the co-founders of the AI Now Institute; Trevor Paglen, an artist who works with surveillance technologies; and Laura Poitras, the director.

  • Edward Snowden actually Skyped in.

  • And Ben Wizner, who's his ACLU attorney, and it was moderated by Hasan Elahi, another artist who works with surveillance technology.

  • Oh, jeez, or surveillance in general. Um, and so we were invited to sort of come in and create an experience.

  • And so we dubbed it a design intervention to complement what was happening in the panel.

  • So the museum called it the My Profile Tour.

  • Um, I'm not sure how I feel about that name, but you know, it stuck.

  • So that's what it is.

  • It's "My Profile Tour." "My profile picture" is a little 2004 for a name, I think, a little bit, but it's cool.

  • Uh, and so the purpose of our design intervention was to sort of raise awareness about things that can go wrong.

  • So the fallibility of face detection, face recognition technology and problematic data privacy issues.

  • So this is, like, about a year ago?

  • Um, well, I'll just leave it at that.

  • So we weren't really sure what was gonna happen with those issues, but we knew they were problematic anyway.

  • So this was meant to expose the public to the specific technologies and use cases of machine learning inside that giant, amorphous "AI" thing, and to replace those marketing buzzwords with a little more articulate and specific discussion about these issues.

  • All right, so what did we build?

  • So, I don't know, the elevator pitch is that we built individualized tours based on guests' faces, matched to art currently on display in the museum.

  • Okay.

  • And I know a lot of you are thinking Isn't that what Google did in their app for Google Arts and culture?

  • And now I'm not lying when I tell you this, and you know that's what liars say.

  • But anyway.

  • They released their app, like, two weeks before we went live, and it was just sort of uncanny.

  • I actually didn't find out about their app until after the panel, and I was like, oh, wow, that's disappointing.

  • But at the same time, it was really interesting because they were doing something very, very similar.

  • So I'm gonna kind of walk you through how we did it and some of the differences.

  • But I think it was an uncanny coincidence, and I think a lot of it had to do with how good certain types of technology, like face recognition and detection, were getting at about that time, and how easy they were to apply.

  • So, um, all right, so cool. The My Profile Tour, how did it work?

  • So, Step one, you had to opt in.

  • Not everybody had to do this. Uh, you can imagine, at a panel discussion where Edward Snowden is Skyping in, in the state of Hawaii, where he used to work, it's a little bit weird to have somebody ask to take a picture of you before you go into the panel.

  • So you had to opt in, and then if you did opt in, we took a picture of you and it said, hey, this is the picture we're taking.

  • Would you like to opt in?

  • Uh, if you did, we kind of labeled you in the real world.

  • We gave you a sticker with a number, and it was sort of symbolic, like, this is happening to you.

  • Uh, yeah.

  • I mean, you know, let's be fair.

  • Not every website does this.

  • Okay, well, they don't give you a sticker.

  • Well, actually, stickers are a thing, but anyway. Okay.

  • Uh oh, yeah.

  • Incidentally, ask me later. I have stickers.

  • All right, cool.

  • So, next step: the customized tour.

  • So the panel was an hour long.

  • So before you walked in, we took your picture.

  • When you walked out, we handed you a customized tour.

  • There was a lot that went into sort of printing and formatting and laying these things out ahead of time, so that only the live customized content would need to be applied.

  • Uh, cool.

  • So the process.

  • So I know this is a little small for you to see; I'm sorry about that.

  • This is sort of a version of what we put out in a handout.

  • And so basically the top part of it is stuff that happened ahead of time.

  • And the bottom part is stuff that happened on site that day, and I'll walk you through that.

  • So step one of I think almost every one of these machine learning slash you know, whatever you wanna call it, artificial intelligence projects is data collection, right?

  • Or getting data from somewhere.

  • And so, as you can imagine, when you're working with a museum that hasn't really done this type of thing before, data collection is hard.

  • Data collection is hard.

  • In my experience, no matter what project I'm working on. So we had to walk through the entire museum, find out where all the art was, write down all the numbers associated with that art, figure out what they were called and what room they were in, and then get a photo of everything. That was a lot of work.

  • And then we sort of processed the images of the art as well as the sculpture.

  • So, anything that they had that was gonna be on display at that time, that wasn't in, like, a special exhibition where we weren't allowed to photograph the art.

  • Um, we passed them through a face detection neural network, which essentially just says, hey, I think there's a face here, and I think it's inside of this rectangle, and then you can highlight that or crop it out.

  • So in this case, we just cropped out the faces, and then we used a second neural network, a landmark prediction neural network, which essentially says, hey, based on you telling me this is a face, I think this is where the eyebrows are, the eyes, the nose, the mouth, the jaw, et cetera. And the image on the bottom there is me.

  • I just wanted to throw in here that we did a lot of this in Python, because the beginning of every web project using machine learning is to install Python on your machine.

  • Uh, this is something that you can actually do in the web.

  • So I wanted to put this link here.

  • This is an example that runs in ml5.js and is rendered with p5.js, or, you know, sort of put into the browser with p5.js.

  • It's super cool.

  • So if you, like, love JavaScript, you can do all of this stuff from the comfort of your JavaScript.

  • But just remember, you have to install Python, because that's just required.

  • But don't be afraid of Python.

  • It's just executable pseudocode.

  • All right, Uh, next.

  • Let's see.

  • That's a joke for the Python people.

  • All right, so this prepares us for a numerical comparison, so we could actually pass that set of landmarks to a face recognition neural network, which then essentially renders that image into a bunch of numbers, like a big vector, right?

  • And so now that we have the face and we know where the landmarks are, we've turned it into a bunch of numbers.

  • We can start matching it to things.

  • Okay, so, like, we stash all that stuff away.

  • And then on the day of the event, people come in, we take their picture, right, and then we find the landmarks on their face, and then we can render them into numbers and match them to the art that they're similar to. Cool.

  • And then, since we have scary facial recognition software, we can then say, hey, do you want your tour?

  • Oh, we know who you are.

  • So here's your tour.

  • Right.

  • So that was another thing we had. Um, so it wasn't just about matching faces to art.

  • We also did a number of other things.

  • So this is the tour folded up that we handed to them.

  • So it has their face, it has their landmarks.

  • And it has a number of super problematic classifications that we said were attributes of you, based on your face.

  • And this was not meant to be true; it was meant to be super problematic, and something that people should question.

  • So special thanks to Kyle McDonald, who's an artist, um, who trained the neural network to work with these classifications on the Labeled Faces in the Wild data set.

  • Um, it's another one of those things where you can pop that thing in the browser, but anyway.

  • I won't go into it.

  • Um, so these labels were super problematic, and I think in some cases kind of insulting; it might tell you you're not attractive, right?

  • And, uh, it might tell you you look tired, you know, stuff like that.

  • And you know, that's not something you want to hear from a computer, or from anybody, but nonetheless, computers will do that.

  • And so on the next page, this is the stuff that, besides the labels, was custom generated; so here we were able to say, okay, we think you look like this painting.

  • We think you look like this sculpture and then based on everything that was in the frame when you took your photo, we match it.

  • We're matching you using a different neural network that does object matching.

  • And I'll go into that a little bit later.

  • And then, because of the classifications that we gave you, we're going to say, this piece of art is something you might be interested in, based on what was strongest in your classifications; and then we created a path through other pieces of art, sort of like a trail that you could follow from one piece of art to the next.

  • This was an educational project.

  • So what we really wanted to do was to provide people a way to question the technology and have it be personal, so that people could have further, more articulate discussions in the community.

  • So we also wanted to highlight where our technologies failed.

  • I'll go into that a little bit.

  • Um, and then also we provided details about how the art was chosen, at a very technical level, if people wanted to dig into that as well. So cool.

  • So one of the things that we really wanted to figure out was, you know, what are the ways we can find art that we can match to people?

  • And so one of the techniques that we came across is something called reverse image search, and reverse image search essentially takes an image, uses an object classification neural network, and you kind of short-circuit it to provide you with numbers instead of names of things.

  • And then, with those numbers, you can sort of compare them to other objects and find, in the data set, other objects that look the same.

  • So that picture kind of explains itself.

  • One of the object networks we used was called VGG16.

  • It's 16 layers.

  • You slice off the last layer that basically tells you what the thing is, and you're left with a giant vector for every object.

  • That's numbers.

  • Um, here's another example of finding stuff that's similar to the vase on the left.

  • So once you've got these numbers, you basically have coordinates in this multi-dimensional space, and you can use traditional, uh, distance algorithms to see what's close to what.

  • And then once you do something called PCA, or principal component analysis, you can slam 4,000 dimensions down into two dimensions, using this algorithm, which essentially maintains closeness of points from the multi-dimensional space in the two-dimensional translation.

  • And then once you're in two dimensions, you can use your traditional graph algorithms to find shortest paths, like you would to find a route through the city.

  • So in this case, we have a picture of a person, you may have heard of him, and the vase on the right-hand side.

  • And we tried to find items in the museum collection that would sort of transition from one to the next.

  • I'm not sure how convincing this is.

  • I kind of think it's convincing.

  • Maybe it's, like, bias on my part for selecting this, but anyway, um, you kind of get the picture.

  • This is how we selected the art.

  • Uh, so, lessons learned. Ah, we had to try a lot of different ways to find things that would work, to convincingly match you to a piece of art.

  • And when we got to this point, we were like, yeah, that's pretty good.

  • Zuckerberg looks like that guy.

  • So Okay, we're like, we're done.

  • All right.

  • Uh, but there are also mistakes that are made, and I should've put, like, a warning or something on the previous slide. So we started out with this thing called the OpenCV Haar classifier to find faces.

  • Turned out, it's not that good.

  • Or at least it was good for a while.

  • But then, you know, these convolutional neural networks came out.

  • We ended up using one via a library called dlib, which has a Python API.