| Tag | Ind. | Content |
|---|---|---|
| 000 | | 00000nam u2200205 a 4500 |
| 001 | | 000045935549 |
| 005 | | 20250829161508 |
| 008 | | 180314s2018 nyua 001 0 eng d |
| 020 | | ▼a 9781617294433 |
| 035 | | ▼a (KERIS)BIB000014702280 |
| 040 | | ▼a 211048 ▼c 211048 ▼d 211009 |
| 082 | 0 4 | ▼a 006.31 ▼2 23 |
| 084 | | ▼a 006.31 ▼2 DDCK |
| 090 | | ▼a 006.31 ▼b C547d |
| 100 | 1 | ▼a Chollet, François, ▼d 1957- ▼0 AUTH(211009)113297 |
| 245 | 1 0 | ▼a Deep learning with Python / ▼c François Chollet. |
| 260 | | ▼a Shelter Island, NY : ▼b Manning, ▼c c2018. |
| 300 | | ▼a xxi, 361 p. : ▼b ill. ; ▼c 24 cm. |
| 500 | | ▼a Includes index. |
| 650 | 0 | ▼a Python (Computer program language) |
| 650 | 0 | ▼a Machine learning. |
| 650 | 0 | ▼a Neural networks (Computer science) |
| 945 | | ▼a KLPA |
Holdings Information

| No. | Location | Call Number | Registration No. | Status | Due Date | Reservation | Service |
|---|---|---|---|---|---|---|---|
| 1 | Main Library / Stacks, 6F | 006.31 C547d | 111787697 (15 checkouts) | Available | | | |
| 2 | Main Library / Stacks, 6F | 006.31 C547d | 111810246 (7 checkouts) | Available | | | |
Contents Information

Book Description
Summary
Deep Learning with Python introduces the field of deep learning using the Python language and the powerful Keras library. Written by Keras creator and Google AI researcher François Chollet, this book builds your understanding through intuitive explanations and practical examples. Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the Technology
Machine learning has made remarkable progress in recent years. We went from near-unusable speech and image recognition to near-human accuracy. We went from machines that couldn't beat a serious Go player to defeating a world champion. Behind this progress is deep learning--a combination of engineering advances, best practices, and theory that enables a wealth of previously impossible smart applications.

About the Book
Deep Learning with Python introduces the field of deep learning using the Python language and the powerful Keras library. Written by Keras creator and Google AI researcher François Chollet, this book builds your understanding through intuitive explanations and practical examples. You'll explore challenging concepts and practice with applications in computer vision, natural-language processing, and generative models. By the time you finish, you'll have the knowledge and hands-on skills to apply deep learning in your own projects.

What's Inside
- Deep learning from first principles
- Setting up your own deep-learning environment
- Image-classification models
- Deep learning for text and sequences
- Neural style transfer, text generation, and image generation
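To give a sense of the hands-on style in which these topics are taught, here is a minimal Keras sketch in the spirit of the book's opening MNIST example (section 2.1). The exact layer sizes and hyperparameters below are this sketch's assumptions, not code quoted from the book:

```python
# Minimal dense-network MNIST classifier, in the style of the book's
# first example (Keras 2.x-era API; details are illustrative assumptions).
from keras import models, layers
from keras.datasets import mnist
from keras.utils import to_categorical

# Load the MNIST digits, flatten the 28x28 images, scale pixels to [0, 1].
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()
train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255
test_images = test_images.reshape((10000, 28 * 28)).astype("float32") / 255
train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)

# A small fully connected classifier: one hidden layer, softmax output.
network = models.Sequential()
network.add(layers.Dense(512, activation="relu", input_shape=(28 * 28,)))
network.add(layers.Dense(10, activation="softmax"))
network.compile(optimizer="rmsprop",
                loss="categorical_crossentropy",
                metrics=["accuracy"])

# Train for a few epochs, then check generalization on the held-out test set.
network.fit(train_images, train_labels, epochs=5, batch_size=128)
test_loss, test_acc = network.evaluate(test_images, test_labels)
print(f"test accuracy: {test_acc:.3f}")
```

Section 2.1, "A first look at a neural network," develops a workflow along these lines before the later chapters move on to convnets, sequence models, and generative techniques.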
About the Reader
Readers need intermediate Python skills. No previous experience with Keras, TensorFlow, or machine learning is required.

About the Author
François Chollet is one of the most important researchers in modern-day deep learning. His groundbreaking work includes the creation of the Keras deep learning library, and major contributions to the TensorFlow framework. These tools have helped revolutionize and democratize deep learning. François is an AI researcher and Senior Staff Software Engineer at Google. François authored Deep Learning with R alongside J.J. Allaire, and developed the Abstraction and Reasoning Challenge that measures AI skill-acquisition on unknown tasks.

Table of Contents
PART 1 - FUNDAMENTALS OF DEEP LEARNING
- What is deep learning?
- Before we begin: the mathematical building blocks of neural networks
- Getting started with neural networks
- Fundamentals of machine learning

PART 2 - DEEP LEARNING IN PRACTICE
- Deep learning for computer vision
- Deep learning for text and sequences
- Advanced deep-learning best practices
- Generative deep learning
- Conclusions
- Appendix A - Installing Keras and its dependencies on Ubuntu
- Appendix B - Running Jupyter notebooks on an EC2 GPU instance
From the Publisher
For over thirty years, Manning Publications has been delivering impeccable quality in tech publishing. Our rich and independent history is filled with innovations, including groundbreaking early access programs, DRM-free ebooks, and live learning projects. We spend thousands of hours making each Manning book outstanding--and our readers agree! We're regularly told that Manning produces the very best tech content you can buy. Manning authors are technology experts, including distinguished academics, industry veterans, and the creators of major tools. Timeless Manning classics include François Chollet's Deep Learning with Python, Jon Skeet's C# in Depth, Don Jones' Learn Windows PowerShell in a Month of Lunches, and Chris Richardson's Microservices Patterns. We're proud to help some of the world's greatest programmers share their unique insight with you.
Table of Contents
Preface p. xiii
Acknowledgments p. xv
About this book p. xvi
About the author p. xx
About the cover p. xxi
Part 1 Fundamentals of deep learning p. 1
1 What is deep learning? p. 3
1.1 Artificial intelligence, machine learning, and deep learning p. 4
Artificial intelligence p. 4
Machine learning p. 4
Learning representations from data p. 6
The "deep" in deep learning p. 8
Understanding how deep learning works, in three figures p. 9
What deep learning has achieved so far p. 11
Don't believe the short-term hype p. 12
The promise of AI p. 13
1.2 Before deep learning: a brief history of machine learning p. 14
Probabilistic modeling p. 14
Early neural networks p. 14
Kernel methods p. 15
Decision trees, random forests, and gradient boosting machines p. 16
Back to neural networks p. 17
What makes deep learning different p. 17
The modern machine-learning landscape p. 18
1.3 Why deep learning? Why now? p. 20
Hardware p. 20
Data p. 21
Algorithms p. 21
A new wave of investment p. 22
The democratization of deep learning p. 23
Will it last? p. 23
2 Before we begin: the mathematical building blocks of neural networks p. 25
2.1 A first look at a neural network p. 27
2.2 Data representations for neural networks p. 31
Scalars (0D tensors) p. 31
Vectors (1D tensors) p. 31
Matrices (2D tensors) p. 31
3D tensors and higher-dimensional tensors p. 32
Key attributes p. 32
Manipulating tensors in Numpy p. 34
The notion of data batches p. 34
Real-world examples of data tensors p. 35
Vector data p. 35
Timeseries data or sequence data p. 35
Image data p. 36
Video data p. 37
2.3 The gears of neural networks: tensor operations p. 38
Element-wise operations p. 38
Broadcasting p. 39
Tensor dot p. 40
Tensor reshaping p. 42
Geometric interpretation of tensor operations p. 43
A geometric interpretation of deep learning p. 44
2.4 The engine of neural networks: gradient-based optimization p. 46
What's a derivative? p. 47
Derivative of a tensor operation: the gradient p. 48
Stochastic gradient descent p. 48
Chaining derivatives: the Backpropagation algorithm p. 51
2.5 Looking back at our first example p. 53
2.6 Chapter summary p. 55
3 Getting started with neural networks p. 56
3.1 Anatomy of a neural network p. 58
Layers: the building blocks of deep learning p. 58
Models: networks of layers p. 59
Loss functions and optimizers: keys to configuring the learning process p. 60
3.2 Introduction to Keras p. 61
Keras, TensorFlow, Theano, and CNTK p. 62
Developing with Keras: a quick overview p. 62
3.3 Setting up a deep-learning workstation p. 65
Jupyter notebooks: the preferred way to run deep-learning experiments p. 65
Getting Keras running: two options p. 66
Running deep-learning jobs in the cloud: pros and cons p. 66
What is the best GPU for deep learning? p. 66
3.4 Classifying movie reviews: a binary classification example p. 68
The IMDB dataset p. 68
Preparing the data p. 69
Building your network p. 70
Validating your approach p. 73
Using a trained network to generate predictions on new data p. 76
Further experiments p. 77
Wrapping up p. 77
3.5 Classifying newswires: a multiclass classification example p. 78
The Reuters dataset p. 78
Preparing the data p. 79
Building your network p. 79
Validating your approach p. 80
Generating predictions on new data p. 83
A different way to handle the labels and the loss p. 83
The importance of having sufficiently large intermediate layers p. 83
Further experiments p. 84
Wrapping up p. 84
3.6 Predicting house prices: a regression example p. 85
The Boston Housing Price dataset p. 85
Preparing the data p. 86
Building your network p. 86
Validating your approach using K-fold validation p. 87
Wrapping up p. 91
3.7 Chapter summary p. 92
4 Fundamentals of machine learning p. 93
4.1 Four branches of machine learning p. 94
Supervised learning p. 94
Unsupervised learning p. 94
Self-supervised learning p. 94
Reinforcement learning p. 95
4.2 Evaluating machine-learning models p. 97
Training, validation, and test sets p. 97
Things to keep in mind p. 100
4.3 Data preprocessing, feature engineering, and feature learning p. 101
Data preprocessing for neural networks p. 101
Feature engineering p. 102
4.4 Overfitting and underfitting p. 104
Reducing the network's size p. 104
Adding weight regularization p. 107
Adding dropout p. 109
4.5 The universal workflow of machine learning p. 111
Defining the problem and assembling a dataset p. 111
Choosing a measure of success p. 112
Deciding on an evaluation protocol p. 112
Preparing your data p. 112
Developing a model that does better than a baseline p. 113
Scaling up: developing a model that overfits p. 114
Regularizing your model and tuning your hyperparameters p. 114
4.6 Chapter summary p. 116
Part 2 Deep learning in practice p. 117
5 Deep learning for computer vision p. 119
5.1 Introduction to convnets p. 120
The convolution operation p. 122
The max-pooling operation p. 127
5.2 Training a convnet from scratch on a small dataset p. 130
The relevance of deep learning for small-data problems p. 130
Downloading the data p. 131
Building your network p. 133
Data preprocessing p. 135
Using data augmentation p. 138
5.3 Using a pretrained convnet p. 143
Feature extraction p. 143
Fine-tuning p. 152
Wrapping up p. 159
5.4 Visualizing what convnets learn p. 160
Visualizing intermediate activations p. 160
Visualizing convnet filters p. 167
Visualizing heatmaps of class activation p. 172
5.5 Chapter summary p. 177
6 Deep learning for text and sequences p. 178
6.1 Working with text data p. 180
One-hot encoding of words and characters p. 181
Using word embeddings p. 184
Putting it all together: from raw text to word embeddings p. 188
Wrapping up p. 195
6.2 Understanding recurrent neural networks p. 196
A recurrent layer in Keras p. 198
Understanding the LSTM and GRU layers p. 202
A concrete LSTM example in Keras p. 204
Wrapping up p. 206
6.3 Advanced use of recurrent neural networks p. 207
A temperature-forecasting problem p. 207
Preparing the data p. 210
A common-sense, non-machine-learning baseline p. 212
A basic machine-learning approach p. 213
A first recurrent baseline p. 215
Using recurrent dropout to fight overfitting p. 216
Stacking recurrent layers p. 217
Using bidirectional RNNs p. 219
Going even further p. 222
Wrapping up p. 223
6.4 Sequence processing with convnets p. 225
Understanding 1D convolution for sequence data p. 225
1D pooling for sequence data p. 226
Implementing a 1D convnet p. 226
Combining CNNs and RNNs to process long sequences p. 228
Wrapping up p. 231
6.5 Chapter summary p. 232
7 Advanced deep-learning best practices p. 233
7.1 Going beyond the Sequential model: the Keras functional API p. 234
Introduction to the functional API p. 236
Multi-input models p. 238
Multi-output models p. 240
Directed acyclic graphs of layers p. 242
Layer weight sharing p. 246
Models as layers p. 247
Wrapping up p. 248
7.2 Inspecting and monitoring deep-learning models using Keras callbacks and TensorBoard p. 249
Using callbacks to act on a model during training p. 249
Introduction to TensorBoard: the TensorFlow visualization framework p. 252
Wrapping up p. 259
7.3 Getting the most out of your models p. 260
Advanced architecture patterns p. 260
Hyperparameter optimization p. 263
Model ensembling p. 264
Wrapping up p. 266
7.4 Chapter summary p. 268
8 Generative deep learning p. 269
8.1 Text generation with LSTM p. 271
A brief history of generative recurrent networks p. 271
How do you generate sequence data? p. 272
The importance of the sampling strategy p. 272
Implementing character-level LSTM text generation p. 274
Wrapping up p. 279
8.2 DeepDream p. 280
Implementing DeepDream in Keras p. 281
Wrapping up p. 286
8.3 Neural style transfer p. 287
The content loss p. 288
The style loss p. 288
Neural style transfer in Keras p. 289
Wrapping up p. 295
8.4 Generating images with variational autoencoders p. 296
Sampling from latent spaces of images p. 296
Concept vectors for image editing p. 297
Variational autoencoders p. 298
Wrapping up p. 304
8.5 Introduction to generative adversarial networks p. 305
A schematic GAN implementation p. 307
A bag of tricks p. 307
The generator p. 308
The discriminator p. 309
The adversarial network p. 310
How to train your DCGAN p. 310
Wrapping up p. 312
8.6 Chapter summary p. 313
9 Conclusions p. 314
9.1 Key concepts in review p. 315
Various approaches to AI p. 315
What makes deep learning special within the field of machine learning p. 315
How to think about deep learning p. 316
Key enabling technologies p. 317
The universal machine-learning workflow p. 318
Key network architectures p. 319
The space of possibilities p. 322
9.2 The limitations of deep learning p. 325
The risk of anthropomorphizing machine-learning models p. 325
Local generalization vs. extreme generalization p. 327
Wrapping up p. 329
9.3 The future of deep learning p. 330
Models as programs p. 330
Beyond backpropagation and differentiable layers p. 332
Automated machine learning p. 332
Lifelong learning and modular subroutine reuse p. 333
The long-term vision p. 335
9.4 Staying up to date in a fast-moving field p. 337
Practice on real-world problems using Kaggle p. 337
Read about the latest developments on arXiv p. 337
Explore the Keras ecosystem p. 338
9.5 Final words p. 339
Appendix A Installing Keras and its dependencies on Ubuntu p. 340
Appendix B Running Jupyter notebooks on an EC2 GPU instance p. 345
Index p. 353
