| 000 | 00000nam u2200205 a 4500 | |
| 001 | 000045958476 | |
| 005 | 20181024154809 | |
| 008 | 181024s2018 nyua b 001 0 eng d | |
| 020 | ▼a 9781617293870 | |
| 040 | ▼a 211009 ▼c 211009 ▼d 211009 | |
| 082 | 0 4 | ▼a 006.31 ▼2 23 |
| 084 | ▼a 006.31 ▼2 DDCK | |
| 090 | ▼a 006.31 ▼b S562m | |
| 100 | 1 | ▼a Shukla, Nishant. |
| 245 | 1 0 | ▼a Machine learning with TensorFlow / ▼c Nishant Shukla ; Kenneth Fricklas, Senior Technical Editor. |
| 260 | ▼a Shelter Island, NY : ▼b Manning, ▼c c2018. | |
| 300 | ▼a xx, 251 p. : ▼b ill. ; ▼c 24 cm. | |
| 504 | ▼a Includes bibliographical references and index. | |
| 630 | 0 0 | ▼a TensorFlow (Electronic resource). |
| 650 | 0 | ▼a Machine learning. |
| 650 | 0 | ▼a Artificial intelligence. |
| 700 | 1 | ▼a Fricklas, Kenneth. |
| 945 | ▼a KLPA |
Holdings Information
| No. | Location | Call Number | Registration No. | Status | Due Date | Reservation | Service |
|---|---|---|---|---|---|---|---|
| 1 | Science Library/Sci-Info (2F Stacks) | 006.31 S562m | 121246306 (6 loans) | Available for loan | | | |
Contents Information
Book Introduction
Summary
Machine Learning with TensorFlow gives readers a solid foundation in machine-learning concepts plus hands-on experience coding TensorFlow with Python.
Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.
About the Technology
TensorFlow, Google's library for large-scale machine learning, simplifies often-complex computations by representing them as graphs and efficiently mapping parts of the graphs to machines in a cluster or to the processors of a single machine.
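The graph model described above (build the computation first, evaluate it later) can be illustrated with a toy pure-Python sketch. This is only a hypothetical illustration of the dataflow-graph idea, not TensorFlow's actual API; the `Node`, `constant`, `add`, and `mul` names are invented for this example.

```python
# Toy illustration of the dataflow-graph idea behind TensorFlow:
# operations are first recorded as nodes in a graph, and values are
# computed only when the graph is explicitly run, much as a
# TensorFlow session evaluates a graph. Pure Python, no TensorFlow.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # callable that produces this node's value
        self.inputs = inputs  # upstream nodes this node depends on

    def run(self):
        """Evaluate this node by recursively evaluating its inputs."""
        return self.op(*(n.run() for n in self.inputs))

def constant(value):
    return Node(lambda: value)

def add(a, b):
    return Node(lambda x, y: x + y, (a, b))

def mul(a, b):
    return Node(lambda x, y: x * y, (a, b))

# Building the graph for (2 + 3) * 4 performs no arithmetic yet...
graph = mul(add(constant(2), constant(3)), constant(4))
# ...the computation happens only when the graph is run.
print(graph.run())  # 20
```

Because the whole computation is available as a graph before anything runs, a system like TensorFlow can inspect it, optimize it, and assign its parts to different processors or machines.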
About the Book
Machine Learning with TensorFlow gives readers a solid foundation in machine-learning concepts plus hands-on experience coding TensorFlow with Python. You'll learn the basics by working with classic prediction, classification, and clustering algorithms. Then, you'll move on to the money chapters: exploration of deep-learning concepts like autoencoders, recurrent neural networks, and reinforcement learning. Digest this book and you will be ready to use TensorFlow for machine-learning and deep-learning applications of your own.
What's Inside
- Matching your tasks to the right machine-learning and deep-learning approaches
- Visualizing algorithms with TensorBoard
- Understanding and using neural networks
About the Reader
Written for developers experienced with Python and algebraic concepts like vectors and matrices.
About the Author
Nishant Shukla is a computer vision researcher focused on applying machine-learning techniques in robotics.
Senior technical editor Kenneth Fricklas is a seasoned developer, author, and machine-learning practitioner.
Table of Contents
- PART 1 - YOUR MACHINE-LEARNING RIG
- A machine-learning odyssey
- TensorFlow essentials
- PART 2 - CORE LEARNING ALGORITHMS
- Linear regression and beyond
- A gentle introduction to classification
- Automatically clustering data
- Hidden Markov models
- PART 3 - THE NEURAL NETWORK PARADIGM
- A peek into autoencoders
- Reinforcement learning
- Convolutional neural networks
- Recurrent neural networks
- Sequence-to-sequence models for chatbots
- Utility landscape
About the Author
Nishant Shukla (Author)
A doctoral researcher at UCLA, he studies machine-learning and computer-vision techniques in the field of robotics. He majored in computer science and mathematics at the University of Virginia and is a founding member of Hack.UVA. He drew attention for his teaching of the Haskell language. He has worked as a developer at Microsoft, Facebook, and Foursquare, and as a machine-learning engineer at SpaceX, and he is the author of the Haskell Data Analysis Cookbook (http://haskelldata.com). He has written research papers on topics ranging from analytical chemistry to natural language processing (http://mng.bz/e9sk). He enjoys the board game Settlers of Catan and the card game Gwent.
Table of Contents
Preface p. xiii
Acknowledgments p. xv
About this book p. xvii
About the author p. xix
About the cover p. xx
Part 1 Your Machine-Learning Rig p. 1
1 A machine-learning odyssey p. 3
1.1 Machine-learning fundamentals p. 5
Parameters p. 7
Learning and inference p. 8
1.2 Data representation and features p. 9
1.3 Distance metrics p. 15
1.4 Types of learning p. 17
Supervised learning p. 17
Unsupervised learning p. 19
Reinforcement learning p. 19
1.5 TensorFlow p. 21
1.6 Overview of future chapters p. 22
1.7 Summary p. 24
2 TensorFlow essentials p. 25
2.1 Ensuring that TensorFlow works p. 27
2.2 Representing tensors p. 28
2.3 Creating operators p. 32
2.4 Executing operators with sessions p. 34
Understanding code as a graph p. 35
Setting session configurations p. 36
2.5 Writing code in Jupyter p. 38
2.6 Using variables p. 41
2.7 Saving and loading variables p. 43
2.8 Visualizing data using TensorBoard p. 44
Implementing a moving average p. 44
Visualizing the moving average p. 46
2.9 Summary p. 49
Part 2 Core Learning Algorithms p. 51
3 Linear regression and beyond p. 53
3.1 Formal notation p. 54
How do you know the regression algorithm is working? p. 57
3.2 Linear regression p. 59
3.3 Polynomial model p. 62
3.4 Regularization p. 65
3.5 Application of linear regression p. 69
3.6 Summary p. 70
4 A gentle introduction to classification p. 71
4.1 Formal notation p. 73
4.2 Measuring performance p. 75
Accuracy p. 75
Precision and recall p. 76
Receiver operating characteristic curve p. 77
4.3 Using linear regression for classification p. 78
4.4 Using logistic regression p. 83
Solving one-dimensional logistic regression p. 84
Solving two-dimensional logistic regression p. 87
4.5 Multiclass classifier p. 90
One-versus-all p. 91
One-versus-one p. 92
Softmax regression p. 92
4.6 Application of classification p. 96
4.7 Summary p. 97
5 Automatically clustering data p. 99
5.1 Traversing files in TensorFlow p. 100
5.2 Extracting features from audio p. 102
5.3 K-means clustering p. 106
5.4 Audio segmentation p. 109
5.5 Clustering using a self-organizing map p. 112
5.6 Application of clustering p. 117
5.7 Summary p. 117
6 Hidden Markov models p. 119
6.1 Example of a not-so-interpretable model p. 121
6.2 Markov model p. 121
6.3 Hidden Markov model p. 124
6.4 Forward algorithm p. 125
6.5 Viterbi decoding p. 128
6.6 Uses of hidden Markov models p. 130
Modeling a video p. 130
Modeling DNA p. 130
Modeling an image p. 130
6.7 Application of hidden Markov models p. 130
6.8 Summary p. 131
Part 3 The Neural Network Paradigm p. 133
7 A peek into autoencoders p. 135
7.1 Neural networks p. 136
7.2 Autoencoders p. 140
7.3 Batch training p. 145
7.4 Working with images p. 146
7.5 Application of autoencoders p. 150
7.6 Summary p. 151
8 Reinforcement learning p. 153
8.1 Formal notions p. 155
Policy p. 156
Utility p. 157
8.2 Applying reinforcement learning p. 158
8.3 Implementing reinforcement learning p. 160
8.4 Exploring other applications of reinforcement learning p. 167
8.5 Summary p. 168
9 Convolutional neural networks p. 169
9.1 Drawback of neural networks p. 170
9.2 Convolutional neural networks p. 171
9.3 Preparing the image p. 173
Generating filters p. 176
Convolving using filters p. 178
Max pooling p. 181
9.4 Implementing a convolutional neural network in TensorFlow p. 182
Measuring performance p. 185
Training the classifier p. 186
9.5 Tips and tricks to improve performance p. 187
9.6 Application of convolutional neural networks p. 188
9.7 Summary p. 188
10 Recurrent neural networks p. 189
10.1 Contextual information p. 190
10.2 Introduction to recurrent neural networks p. 190
10.3 Implementing a recurrent neural network p. 192
10.4 A predictive model for time-series data p. 195
10.5 Application of recurrent neural networks p. 198
10.6 Summary p. 199
11 Sequence-to-sequence models for chatbots p. 201
11.1 Building on classification and RNNs p. 202
11.2 Seq2seq architecture p. 205
11.3 Vector representation of symbols p. 210
11.4 Putting it all together p. 212
11.5 Gathering dialogue data p. 220
11.6 Summary p. 222
12 Utility landscape p. 223
12.1 Preference model p. 226
12.2 Image embedding p. 231
12.3 Ranking images p. 234
12.4 Summary p. 239
12.5 What's next? p. 239
Appendix Installation p. 241
Index p. 247
