TensorFlow.jl: A Julia Front End to the TensorFlow World (TF Dev Summit '19)

[MUSIC PLAYING]

JON MALMAUD: Hello, everyone. My name's Jon, and I'll be talking today about TensorFlow.jl, an interface between Julia and TensorFlow.

So what is Julia? Some of you have probably never heard of it. It's a really dynamic yet performant programming language that's been developed at MIT for around 10 years. It's a fresh rethinking of what a scientific programming language can be like. And version 1.0 was finally released last year to a cheering audience at JuliaCon. So I'm going to give you a whirlwind tour of what Julia is and then talk to you about TensorFlow.jl. Here's a very simple Julia program. I think anyone with a Python background can guess what it's doing, except for this @show business, but that's actually a macro.

And who is Jon? I'm finishing my PhD at MIT. I do machine learning by day and open-source software by night, and sometimes by day too, to the chagrin of my advisor.

So why consider Julia? For one thing, it's got an ultrafast just-in-time compiler. Here I'm running an autoregressive process using for loops, which I think is the most natural way to express one. Why don't we just run that for 100 million iterations in Julia and Python? You can see the syntax looks pretty similar between them. Well, Python takes about 13 seconds. C takes about 0.9 seconds. Julia takes about 0.95 seconds. Not bad.

And you have powerful metaprogramming. Anyone from a Lisp background will appreciate this: you can write a macro that takes in Julia syntax and outputs Julia syntax. Here's just a trivial toy macro that checks if a number is less than zero and prints out a warning. And note that since it's a macro, it can print out the name of the variable and change the variable in place. You can't really do that with a normal Python function.

And there's this really nice multiple-dispatch system, where you can define multiple versions of a function, and the one that's called depends on the types of all the arguments at runtime. Here's a really cool rock-paper-scissors example that someone from the community created. What's nice is I can say that, for any shape, if shape A beats shape B, then shape B loses to shape A. And I can encode that information with one line.

And Julia's growing fast. We've about doubled in GitHub stars in the last year alone. We're probably above 20,000 now. Things have really started to take off since 1.0 was finalized.

And now, TensorFlow.jl. Here's a quick glimpse of its syntax. It's clearly inspired by Keras and the whole TensorFlow 2.0 world. We're now eager by default, but graph mode is available. If you choose to use graph mode, macros will help you out. So here a macro is transforming native Julia control flow, this while loop, into a graph-mode while loop. And macros can do other nice things for you. Here we're visualizing a program with TensorBoard. And note that the labels on the nodes are automatically inferred from the variable names, which is what macros enable.

Macros can do other cool things. François of Keras fame posted this really nice example of implementing a model in Keras. But someone replied to the tweet, saying: this is really pretty, except why are there all these x's? I wish we could eliminate those. I've highlighted those in red. And luckily, there's a Julia macro that will automatically thread a variable through a program, so we don't need those x's in Julia. I think that's pretty elegant.
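(To make the threading idea concrete, here is a minimal sketch of such a macro written from scratch. The name @thread_first is made up for illustration; it is not necessarily the macro shown on the slide.)

```julia
# Minimal sketch of a variable-threading macro (the name @thread_first is
# hypothetical, not the one from the talk). It rewrites
#   @thread_first(x, f, g, h)   into   h(g(f(x)))
# so intermediate variables never need to be named.
macro thread_first(x, fs...)
    ex = esc(x)
    for f in fs
        ex = :($(esc(f))($ex))   # wrap the expression in the next call
    end
    return ex
end

# sum(reverse(collect(1:10))) with no intermediate x's:
result = @thread_first(1:10, collect, reverse, sum)
@show result   # 55
```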
And you can benefit from really fast preprocessing. So if I want to tokenize my corpus, maybe it's got 100 million tokens and I have a custom tokenization scheme, I can just write that as a for loop and trust Julia's just-in-time compiler to do the right thing. You don't want to write a for loop in Python that goes over 100 million elements. You can try.

If you're worried about leaving Python, don't be. We have lots of good scientific computing libraries right in Julia. And if you need to use Python, we have a very good Python interface. So here's an example of calling into SciPy. If I didn't tell you those last two lines were Julia, you'd probably think you were just writing Python. It's very simple.
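(The interop he's describing goes through Julia's PyCall package; a rough sketch follows. The specific scipy.stats call is a stand-in, not necessarily the snippet on the slide.)

```julia
# Rough sketch of calling Python from Julia with the PyCall package.
# The scipy.stats example is a stand-in, not the talk's exact snippet.
using PyCall

stats = pyimport("scipy.stats")   # import a Python module from Julia
@show stats.norm.cdf(0.5)         # Python attributes and calls use plain Julia dot syntax
```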
And we're compatible with the whole TensorFlow ecosystem. TensorBoard works. Note again that the label on the TensorBoard graph is automatically inferred from the variable. You can save graphs in Julia, load them in Python, and vice versa. And soon you'll be able to do that for saved models as well. So if some of your collaborators are using Python, don't think you have to switch to Julia. You can use Julia yourself and still work with your Python friends. And someday they might use Julia.

If you want to define a custom operation and use Julia's just-in-time compiler to make it fast, you can do that. Maybe you want to write a ray tracer. You can use Julia's really easy C foreign-function interface, if that's part of it, and define a gradient for it. And just as a little case study: you can even use Julia's differential-equations package, and that all just works because of Julia's multiple dispatch. So if you wanted to do a [INAUDIBLE], like you might have seen at [INAUDIBLE], that's very simple.

All right, so we still want to do a lot more, like Keras support. We need your help. Come check out our GitHub, come join our Slack chat, and download Julia today. So I just want to thank all my collaborators, especially Lyndon, who's in the audience, and everyone in the TensorFlow world. And I want to thank you.

[APPLAUSE]

[MUSIC PLAYING]