

TensorFlow Dev Summit 2019 Summary

Google held the TensorFlow Dev Summit 2019 in San Francisco on March 6-7. The TensorFlow homepage (tensorflow.org) currently links to a YouTube channel with 33 videos from the event. A great deal has changed, and overall it feels like a fresh statement of what Google is aiming for, but since no summary was available I reluctantly summarized each video myself.


Google's direction is clear from the word "workflow", the umbrella concept behind TensorFlow Extended (TFX): organize the entire process of building AI with machine learning, and have Google's platform used at every step. TFX is documented in detail on the TensorFlow site (https://www.tensorflow.org/tfx), but it is hard to understand if you start there, so I recommend looking at its component pieces first.




I will summarize the sessions one by one as time permits.


The notes below summarize the TensorFlow Dev Summit 2019 sessions.

 

TF 2.0 preview

  • High-level API: tf.keras
  • The TF Keras API is more Pythonic
  • Keras code and TF 2.0 code are the same
  • A converter to TF 2.0 is provided
  • TF 2.0 alpha is released
  • Debugging with eager mode
  • TF Hub
  • TF profiling is integrated into TensorBoard
  • Optimizer and loss function lists
  • A single LSTM kernel version runs on both CPU and GPU
  • Easily customized
  • Data parsing: TensorBoard
  • Going big: multi-GPU strategy
  • Scale models without any code change
  • Export and save models: same as Keras
  • Coming soon: multi-node synchronization
    • Coming soon: TPU too
    • Higher and higher models
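The Keras-first workflow described above can be sketched as follows (the layer sizes and random data are illustrative, not from the talk):

```python
import numpy as np
import tensorflow as tf

# tf.keras is the recommended high-level API in TF 2.0
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Eager mode: no session, call the model on data directly
x = np.random.rand(8, 4).astype("float32")
preds = model.predict(x)
print(preds.shape)  # (8, 3)
```

Exporting works the same way as in standalone Keras, e.g. `model.save(...)`.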

 

TF function and AutoGraph

  • Graphs are useful
  • tf.function()
  • No more session.run()
  • Pythonic functions
  • Python semantics
  • Not all hidden
  • C.graph
  • No manual control dependencies
  • Variables in tf.function
  • No more tf.global_variables_initializer
  • Control flow in tf.function
  • Operator overloading: Python's `if` cannot be overloaded by TF
  • AutoGraph: rewrites `if` into graph control flow
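A minimal sketch of the AutoGraph point above: since Python's `if` cannot be overloaded, `tf.function` rewrites it into graph control flow when the condition is a tensor (the function name is my own):

```python
import tensorflow as tf

@tf.function
def clip_negative(x):
    # AutoGraph converts this Python `if` into tf.cond in the traced graph;
    # no session, no manual control dependencies
    if x > 0:
        return x
    return tf.zeros_like(x)

print(float(clip_negative(tf.constant(3.0))))   # 3.0
print(float(clip_negative(tf.constant(-2.0))))  # 0.0
```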

 

TF dataset

  • Data is a little out of step
  • Data and models together
  • tfds
  • import tensorflow_datasets as tfds
  • train_ds = tfds.load('….')
  • for inputs, targets in train_ds.repeat(10):
  • DatasetBuilder
    • download_and_prepare
    • as_dataset
    • info
  • DatasetInfo: dataset documentation (number of classes, data structure, supervised keys)
  • Supports NumPy usage too
  • Over 30 datasets
  • Make your data famous
  • Come help on GitHub
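The iteration pattern above can be sketched without the tfds download step by substituting a small in-memory `tf.data` dataset; real code would obtain `train_ds` from `tfds.load(...)` instead:

```python
import tensorflow as tf

# Stand-in for tfds.load(...): 6 (input, target) pairs, batched by 2,
# so one pass yields 3 batches
train_ds = tf.data.Dataset.from_tensor_slices(
    (tf.range(6, dtype=tf.float32), tf.range(6))).batch(2)

# The loop shape shown in the talk: iterate (inputs, targets) directly
n_batches = 0
for inputs, targets in train_ds.repeat(10):
    n_batches += 1
print(n_batches)  # 30
```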

 

Tensorboard

  • Google Colab
  • TensorBoard callback
  • Showing TensorBoard in Colab
  • Hyperparameter tuning
  • Hyperparameter comparison across models
  • Select a model using a metric (accuracy)
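A sketch of the TensorBoard callback flow, assuming an arbitrary `logs` directory; the result is viewed with `tensorboard --logdir logs`, or the `%tensorboard` magic inside Colab:

```python
import numpy as np
import tensorflow as tf

# Tiny illustrative model (not from the talk)
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(2,))])
model.compile(optimizer="sgd", loss="mse")

# The callback writes event files during training that TensorBoard displays
tb = tf.keras.callbacks.TensorBoard(log_dir="logs")
x = np.random.rand(16, 2).astype("float32")
y = np.random.rand(16, 1).astype("float32")
model.fit(x, y, epochs=2, callbacks=[tb], verbose=0)
```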

 

Tensorflow.js

 

Tensorflow collaboration and community

  • ~49,500 commits
  • Over 1,830 contributors
  • Growing ecosystem
  • Making contribution easier
  • A more modular TensorFlow
  • More places to contribute: documentation, test plans, blog
  • TensorFlow RFCs: requests for comments are the key way to communicate design
  • Special interest groups
  • Started with SIG Build; now there are six SIGs
  • SIG Addons: additional ops, layers & more
  • SIG Build: build, test, and packaging
    • Community-contributed builds for other architectures (IBM, Intel MKL)
  • SIG IO: connect TF to other systems
    • Apache Ignite, Kafka, Arrow, Parquet, Hadoop, AWS
  • SIG Networking
  • SIG TensorBoard
  • TensorFlow World: 28-31 Oct, Santa Clara, CA (tensorflow.world)

 

In Codice Ratio

  • Technology and humanity together
  • Vatican Secret Archive: size and contents
  • Large-scale learning

 

Tensorflow HUB

  • Less data
  • Less domain expertise
  • Share without code dependencies
  • Modules: pre-trained building blocks
  • Integrated with core TF
  • New SavedModels
  • Keras layers: pre-trained model + new model
  • Publish models
  • tfhub.dev
  • Cross-lingual: new Universal Sentence Encoder
  • AutoAugment
  • More modules & more tools
  • BigGAN, TranslateGAN
  • Search TF2 modules

 

TF Agents

  • A robot learning to walk
  • Reinforcement learning
  • Implementing RL is hard
  • TF eager, tf.keras, tf.function, modules …
  • Learning to walk in a nutshell
  • ParallelPyEnvironment
  • QPolicy, PPOPolicy, ActorPolicy, RandomPolicy, GreedyPolicy
  • Agent: DqnAgent, …
  • Train network: QNetwork
  • Dataset
  • TFEnvironment (parallel environment)
  • driver.run()
  • agent.train()
  • TF-Agents on GitHub
  • DQN CartPole Colab
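The environment → policy → driver → agent shape listed above can be sketched library-free; the names mirror the talk's pipeline but this is not the actual `tf_agents` API, and the toy environment is made up:

```python
import random

class RandomPolicy:
    """Stand-in for a TF-Agents policy: maps an observation to an action."""
    def action(self, obs):
        return random.choice([0, 1])

def toy_env_step(obs, action):
    # Hypothetical environment: the agent moves left/right on a line and
    # is rewarded when its action matches the sign of the observation
    new_obs = obs + (1.0 if action else -1.0)
    reward = float(action == (obs >= 0))
    return new_obs, reward

def driver_run(env_step, policy, steps):
    # The driver's job: run the policy in the environment and collect
    # experience tuples for the agent to train on
    obs, experience = 0.0, []
    for _ in range(steps):
        a = policy.action(obs)
        obs, reward = env_step(obs, a)
        experience.append((obs, a, reward))
    return experience

experience = driver_run(toy_env_step, RandomPolicy(), 100)
print(len(experience))  # 100
```

In TF-Agents proper, the experience would go into a replay buffer / dataset that feeds `agent.train()`.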

 

Tensorflow probability

  • Learn known unknowns
  • VariationalGaussianProcess
  • A toolbox for probabilistic modeling
  • Edward2, Markov chains, MCMC, variational inference
  • Bayesian Methods for Hackers
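As a rough illustration of the MCMC tooling mentioned above, here is a plain-NumPy Metropolis sampler for a standard normal target; TFP's `tfp.mcmc` module provides this (and much more) properly:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prob(x):
    # Unnormalized log-density of a standard normal target
    return -0.5 * x * x

samples, x = [], 0.0
for _ in range(5000):
    proposal = x + rng.normal(scale=1.0)
    # Metropolis accept/reject step
    if np.log(rng.random()) < log_prob(proposal) - log_prob(x):
        x = proposal
    samples.append(x)

# The sample mean should be close to the target's mean of 0
print(float(np.mean(samples)))
```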

 

Model evaluator

  • Quality
  • Performance over time
  • Evaluate data + model
  • Model analysis tools
  • ModelValidator: a component of TFX
  • Avoid pushing models with degraded quality
  • Pusher: blocks pushing a degraded model
  • Model deployment
  • TensorFlow Serving: multi-tenancy, GPU optimization, low latency, request batching, traffic isolation, production-ready, scales in minutes, dynamic version refresh
  • TensorRT optimization
  • Putting it all together again
  • Part of TFX

 

TFX(Tensorflow extended)

  • An end-to-end ML platform
  • Supported across Alphabet and Google
  • Airbnb, PayPal, Twitter
  • Libraries are low level
  • Components are high level: well-defined configuration
  • Metadata store: data dependencies
  • Definitions of artifacts and their properties
  • Execution records of components
  • Lineage tracking across all executions
  • Airflow, Kubeflow Pipelines

 

Julia

  • Much faster than Python, near C
  • Multiple dispatch
  • TensorFlow.jl
  • Graph mode available
  • Macro mode
  • TensorBoard
  • Compatible with the TF ecosystem
  • Custom operations
  • Will support Keras

 

TF lattice

  • Predict whether someone will default
  • TF Lattice models guarantee the monotonicity you ask for
  • Interpolated look-up tables
  • The flexibility you need
  • Plug and play
  • Lattice regression
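The interpolated look-up table idea can be sketched in one dimension: learnable values at fixed keypoints, with linear interpolation between them. Keeping the values increasing is what guarantees monotonicity. The keypoints and values below are made up for illustration, not from TF Lattice:

```python
import numpy as np

# Fixed input keypoints; in TF Lattice the values at these keypoints are
# the learned parameters
keypoints = np.array([0.0, 1.0, 2.0, 3.0])
values = np.array([0.0, 0.5, 0.7, 1.0])  # increasing values -> monotonic output

def lattice_1d(x):
    # Linear interpolation between the keypoint values
    return float(np.interp(x, keypoints, values))

print(lattice_1d(1.5))  # halfway between 0.5 and 0.7
```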

 

Unicode

  • Ragged tensors
  • Variably-shaped sequences of data
  • Irregular shapes
  • Data agnostic
  • Unicode split added
  • Reshape, conv, embedding
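A small example of the ragged tensors and Unicode splitting mentioned above (the example strings are my own):

```python
import tensorflow as tf

# Splitting strings of different lengths produces a RaggedTensor:
# each row holds a variably-shaped sequence of characters
words = tf.constant(["hello", "tensorflow"])
chars = tf.strings.unicode_split(words, "UTF-8")

print(chars.row_lengths().numpy())  # one length per string: [5 10]
```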

 

Teachable machine learning

  • PoseNet

 

Exascale DL

  • Extreme weather phenomena
  • Climate data analytics
  • Opportunities for high-fidelity analytics

 

Mesh-Tensorflow

  • Synchronous data parallelism is typical
  • Giant language models
  • Translation, question answering, dialog
  • Infinite training data freely available
  • Quality improves with model size
  • Transformer LM: 100M parameters
  • LM: 1B parameters
  • Model parallelism
  • Can train giant models
  • Potentially low latency
  • Tricky to design efficient algorithms
  • Giant, unwieldy
  • Spatial splitting
  • A set of similar processors
  • TPU pod, multi-GPU, multi-CPU
  • Logical n-dimensional mesh
  • Model parallelism: split along a different dimension
  • Data and model parallelism on a 2D mesh
  • Mesh dimensions
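The model-parallel splitting idea can be illustrated with a plain-NumPy matrix multiply whose output dimension is sharded across two hypothetical processors; each shard is computed independently and the results are concatenated:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.random((4, 8))   # activations, replicated on both "devices"
w = rng.random((8, 6))   # weight matrix to split along its output dimension

# Each device holds half of w's columns and computes its shard of the output
shards = np.split(w, 2, axis=1)
y_parallel = np.concatenate([x @ s for s in shards], axis=1)

# The sharded result matches the unsharded computation exactly
assert np.allclose(y_parallel, x @ w)
```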

 

Sonnet

  • DeepMind's library for constructing neural networks
  • Transitioning to TF 2.0
  • Multiple forward passes

 


