
Detailed Information

Deep learning with Keras : implement neural networks with Keras on Theano and TensorFlow


Material Type
Book (monograph)
Personal Author
Gulli, Antonio. Pal, Sujit.
Title / Statement of Responsibility
Deep learning with Keras : implement neural networks with Keras on Theano and TensorFlow / Antonio Gulli, Sujit Pal.
Publication
Birmingham ; Mumbai : Packt Publishing, c2017.
Physical Description
303 p. : ill. ; 24 cm.
ISBN
9781787128422
Bibliography Note
Includes bibliographical references and index.
000 00000nam u2200205 a 4500
001 000045919628
005 20171102173620
008 171102s2017 enka b 001 0 eng d
020 ▼a 9781787128422
040 ▼a 211009 ▼c 211009 ▼d 211009
082 0 4 ▼a 006.31 ▼2 23
084 ▼a 006.31 ▼2 DDCK
090 ▼a 006.31 ▼b G973d
100 1 ▼a Gulli, Antonio.
245 1 0 ▼a Deep learning with Keras : ▼b implement neural networks with Keras on Theano and TensorFlow / ▼c Antonio Gulli, Sujit Pal.
260 ▼a Birmingham ; ▼a Mumbai : ▼b Packt Publishing, ▼c c2017.
300 ▼a 303 p. : ▼b ill. ; ▼c 24 cm.
504 ▼a Includes bibliographical references and index.
700 1 ▼a Pal, Sujit.
945 ▼a KLPA

Holdings

No. 1
Location: Science Library / Sci-Info (2nd-floor stacks)
Call Number: 006.31 G973d
Accession No.: 121242243 (checked out 8 times)
Status: Available

Contents Information

Book Introduction

Publisher's Note: This edition from 2017 is outdated and is not compatible with TensorFlow 2 or any of the most recent updates to Python libraries. A new second edition, updated for 2020 and featuring TensorFlow 2, the Keras API, CNNs, GANs, RNNs, NLP, and AutoML, has now been published.


Key Features:

  • Implement various deep learning algorithms in Keras and see how deep learning can be used in games
  • See how various deep learning models and practical use-cases can be implemented using Keras
  • A practical, hands-on guide with real-world examples to give you a strong foundation in Keras


Book Description:

This book starts by introducing you to supervised learning algorithms such as simple linear regression, the classical multilayer perceptron, and more sophisticated deep convolutional networks. You will also explore image processing with recognition of handwritten digit images, classification of images into different categories, and advanced object recognition with related image annotations. An example of identifying salient points for face detection is also provided.
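The book builds these models in Keras; as a rough illustration of what the early chapters cover, here is a framework-free numpy sketch of a multilayer perceptron's forward pass for digit classification. All weights and layer sizes here are illustrative, not taken from the book.

```python
import numpy as np

# Sketch of an MLP forward pass: one hidden ReLU layer, then a softmax
# over the 10 digit classes. Weights would normally be learned; random
# values stand in here.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=-1, keepdims=True)

W1 = rng.normal(0, 0.1, size=(784, 128))  # 28*28 flattened pixels -> 128 hidden units
b1 = np.zeros(128)
W2 = rng.normal(0, 0.1, size=(128, 10))   # hidden units -> 10 digit classes
b2 = np.zeros(10)

x = rng.random(784)                       # a fake flattened 28x28 image
probs = softmax(relu(x @ W1 + b1) @ W2 + b2)
print(probs.shape)  # (10,): one probability per digit class
```

Chapter 1 of the book develops the same idea with Keras `Dense` layers instead of explicit matrix products.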


Next, you will be introduced to Recurrent Networks, which are optimized for processing sequence data such as text, audio, or time series. Following that, you will learn about unsupervised learning algorithms such as Autoencoders and the very popular Generative Adversarial Networks (GANs). You will also explore non-traditional uses of neural networks, such as Style Transfer.
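The recurrence at the heart of the book's RNN chapter can be sketched without any framework. This numpy fragment applies the SimpleRNN update, h_t = tanh(W_x·x_t + W_h·h_{t-1} + b), across a short sequence; the dimensions and random weights are illustrative only.

```python
import numpy as np

# The update a SimpleRNN cell applies at each timestep, carrying a hidden
# state h across the sequence so later steps can depend on earlier inputs.
rng = np.random.default_rng(1)
W_x = rng.normal(0, 0.1, size=(3, 4))  # input dim 3 -> hidden dim 4
W_h = rng.normal(0, 0.1, size=(4, 4))  # hidden -> hidden recurrence
b = np.zeros(4)

h = np.zeros(4)                        # initial hidden state
sequence = rng.random((5, 3))          # 5 timesteps of 3-dim inputs
for x_t in sequence:
    h = np.tanh(x_t @ W_x + h @ W_h + b)

print(h.shape)  # (4,): the final hidden state summarizes the whole sequence
```

In Keras this loop is what a `SimpleRNN` layer performs internally; LSTM and GRU cells (also covered in the book) replace the plain tanh update with gated variants.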


Finally, you will look at reinforcement learning and its application to AI game playing, another popular area of research and application for neural networks.
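The book's game-playing chapter trains a deep Q-network in Keras; the underlying idea is the tabular Q-learning update, which a DQN approximates with a neural network. A minimal sketch of that update rule, with all numbers chosen purely for illustration:

```python
import numpy as np

# Tabular Q-learning update:
#   Q(s,a) <- Q(s,a) + lr * (r + gamma * max_a' Q(s',a') - Q(s,a))
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
lr, gamma = 0.5, 0.9

def q_update(Q, s, a, r, s_next):
    target = r + gamma * Q[s_next].max()   # bootstrapped return estimate
    Q[s, a] += lr * (target - Q[s, a])

# One illustrative transition: in state 0, action 1 yields reward 1.0
# and leads to state 3 (whose Q-values are still all zero).
q_update(Q, s=0, a=1, r=1.0, s_next=3)
print(Q[0, 1])  # 0.5: half of the 1.0 target, since lr = 0.5
```

A deep Q-network replaces the table `Q` with a network that maps a state to Q-values for every action, trained to regress toward the same target.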


What You Will Learn:

  • Optimize step-by-step functions on a large neural network using the Backpropagation algorithm
  • Fine-tune a neural network to improve the quality of results
  • Use deep learning for image and audio processing
  • Use Recursive Neural Tensor Networks (RNTNs) to outperform standard word embedding in special cases
  • Identify problems for which Recurrent Neural Network (RNN) solutions are suitable
  • Explore the process required to implement Autoencoders
  • Evolve a deep neural network using reinforcement learning
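For the backpropagation bullet above, a single gradient step can be worked out by hand for the smallest possible "network", a lone sigmoid neuron with squared-error loss. Everything in this sketch (inputs, weights, learning rate) is illustrative.

```python
import numpy as np

# One backpropagation step for a single sigmoid neuron on one (x, y) pair.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, y = np.array([1.0, -2.0]), 1.0
w, b, lr = np.array([0.5, 0.5]), 0.0, 1.0

def loss(w, b):
    return 0.5 * (sigmoid(w @ x + b) - y) ** 2

before = loss(w, b)
# Chain rule through the loss, the sigmoid, and the affine map:
p = sigmoid(w @ x + b)
dz = (p - y) * p * (1 - p)   # dL/dz
w -= lr * dz * x             # dL/dw = dL/dz * x
b -= lr * dz                 # dL/db = dL/dz
after = loss(w, b)
print(after < before)  # True: the step reduced the error
```

Backpropagation in a deep network is this same chain-rule computation repeated layer by layer, which Keras performs automatically when you call `fit`.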


Who this book is for:

If you are a data scientist with experience in machine learning or an AI programmer with some exposure to neural networks, you will find this book a useful entry point to deep learning with Keras. Knowledge of Python is required for this book.


Source: Aladin

About the Authors

Antonio Gulli (Author)

He is passionate about building technical and managerial excellence in both innovation and execution. His core areas of expertise are cloud computing, deep learning, and search engines. He is currently CTO at Google's Cloud office in Zurich, Switzerland, where he works on search, cloud infrastructure, and data-independent conversational AI. Before that, he worked in the Office of the CTO for EMEA. While a manager at Google Warsaw, he grew the cloud management teams, focused on GCE, Kubernetes, Serverless, Borg, and the Console, into a group of more than 450 engineers. He has been fortunate to gain professional experience in four European countries and to manage teams in six EMEA countries and the United States. ◆ At Elsevier, a leading scientific publisher in Amsterdam, he led scientific publishing as a vice president. ◆ In London, as CTO of Microsoft Ask.com, he was the engineering site lead working on Bing search. ◆ In Italy and the UK, he was CTO of Ask.com Europe. ◆ In Poland, the UK, and Switzerland, he worked for Google. He has co-invented numerous technologies in search, smart energy, the environment, and AI, with 11 patents granted (and 21 filed), and has written several books on coding and machine learning, some of which have been translated into Japanese and Chinese.

Sujit Pal (Author)

He is a technology research director at Elsevier Labs, an advanced technology group within the Reed-Elsevier group. His areas of interest include contextual search, natural language processing, machine learning, and deep learning. At Elsevier, he has worked on several machine learning initiatives, including measuring and improving search quality, image classification and duplicate detection, annotation, and ontology development for medical and scientific corpora.


Table of Contents

Preface = 1
Chapter 1 : Neural Networks Foundations = 9
 Perceptron = 11
  The first example of Keras code = 11
 Multilayer perceptron - the first example of a network = 12
  Problems in training the perceptron and a solution = 13
  Activation function - sigmoid = 14
  Activation function - ReLU = 15
  Activation functions = 15
 A real example - recognizing handwritten digits = 16
  One-hot encoding - OHE = 17
  Defining a simple neural net in Keras = 17
  Running a simple Keras net and establishing a baseline = 21
  Improving the simple net in Keras with hidden layers = 22
  Further improving the simple net in Keras with dropout = 25
  Testing different optimizers in Keras = 28
  Increasing the number of epochs = 34
  Controlling the optimizer learning rate = 34
  Increasing the number of internal hidden neurons = 35
  Increasing the size of batch computation = 37
  Summarizing the experiments run for recognizing handwritten digits = 37
  Adopting regularization for avoiding overfitting = 38
  Hyperparameters tuning = 40
  Predicting output = 40
 A practical overview of backpropagation = 40
 Towards a deep learning approach = 42
 Summary = 43
Chapter 2 : Keras Installation and API = 45
 Installing Keras = 46
  Step 1 - install some useful dependencies = 46
  Step 2 - install Theano = 47
  Step 3 - install TensorFlow = 47
  Step 4 - install Keras = 48
  Step 5 - testing Theano, TensorFlow, and Keras = 48
 Configuring Keras = 49
 Installing Keras on Docker = 50
 Installing Keras on Google Cloud ML = 53
 Installing Keras on Amazon AWS = 56
 Installing Keras on Microsoft Azure = 58
 Keras API = 60
  Getting started with Keras architecture = 60
  An overview of predefined neural network layers = 61
  An overview of predefined activation functions = 64
  An overview of loss functions = 65
  An overview of metrics = 66
  An overview of optimizers = 66
  Some useful operations = 66
  Saving and loading the weights and the architecture of a model = 66
 Callbacks for customizing the training process = 67
  Checkpointing = 68
  Using TensorBoard and Keras = 69
  Using Quiver and Keras = 70
 Summary = 71
Chapter 3 : Deep Learning with ConvNets = 73
 Deep convolutional neural network - DCNN = 74
  Local receptive fields = 74
  Shared weights and bias = 75
  Pooling layers = 76
 An example of DCNN - LeNet = 78
  LeNet code in Keras = 78
  Understanding the power of deep learning = 85
 Recognizing CIFAR-10 images with deep learning = 86
  Improving the CIFAR-10 performance with a deeper network = 91
  Improving the CIFAR-10 performance with data augmentation = 93
  Predicting with CIFAR-10 = 97
 Very deep convolutional networks for large scale image recognition = 98
  Recognizing cats with a VGG-16 net = 99
  Utilizing Keras built-in VGG-16 net module = 100
  Recycling pre-built deep learning models for extracting features = 102
  Very deep inception-v3 net used for transfer learning = 103
 Summary = 106
Chapter 4 : Generative Adversarial Networks and WaveNet = 107
 What is a GAN? = 108
  Some GAN applications = 110
 Deep convolutional generative adversarial networks = 114
 Keras adversarial GANs for forging MNIST = 117
 Keras adversarial GANs for forging CIFAR = 124
 WaveNet - a generative model for learning how to produce audio = 132
 Summary = 141
Chapter 5 : Word Embeddings = 143
 Distributed representations = 144
 word2vec = 145
  The skip-gram word2vec model = 146
  The CBOW word2vec model = 150
  Extracting word2vec embeddings from the model = 152
  Using third-party implementations of word2vec = 155
 Exploring GloVe = 159
 Using pre-trained embeddings = 161
  Learn embeddings from scratch = 162
  Fine-tuning learned embeddings from word2vec = 167
  Fine-tune learned embeddings from GloVe = 171
  Look up embeddings = 172
 Summary = 176
Chapter 6 : Recurrent Neural Network - RNN = 179
 SimpleRNN cells = 180
  SimpleRNN with Keras - generating text = 182
 RNN topologies = 187
 Vanishing and exploding gradients = 188
 Long short term memory - LSTM = 191
  LSTM with Keras - sentiment analysis = 193
 Gated recurrent unit - GRU = 200
  GRU with Keras - POS tagging = 202
 Bidirectional RNNs = 209
 Stateful RNNs = 210
  Stateful LSTM with Keras - predicting electricity consumption = 210
 Other RNN variants = 217
 Summary = 218
Chapter 7 : Additional Deep Learning Models = 219
 Keras functional API = 221
 Regression networks = 223
  Keras regression example - predicting benzene levels in the air = 224
 Unsupervised learning - autoencoders = 228
  Keras autoencoder example - sentence vectors = 230
 Composing deep networks = 239
  Keras example - memory network for question answering = 240
 Customizing Keras = 247
  Keras example - using the lambda layer = 248
  Keras example - building a custom normalization layer = 249
 Generative models = 252
  Keras example - deep dreaming = 252
  Keras example - style transfer = 261
 Summary = 267
Chapter 8 : AI Game Playing = 269
 Reinforcement learning = 270
  Maximizing future rewards = 271
  Q-learning = 272
  The deep Q-network as a Q-function = 273
  Balancing exploration with exploitation = 275
  Experience replay, or the value of experience = 276
 Example - Keras deep Q-network for catch = 276
 The road ahead = 289
 Summary = 291
Appendix : Conclusion = 293
 Keras 2.0 - what is new = 295
  Installing Keras 2.0 = 295
  API changes = 296
Index = 299

New Arrivals in Related Fields

Dyer-Witheford, Nick (2026)
양성봉 (2025)