
Advanced Natural Language Processing with TensorFlow 2

Build effective real-world NLP applications using NER, RNNs, seq2seq models, Transformers, and more

  • E-book list price: 21,000 KRW
  • Sale price: 21,000 KRW
  • Publication: e-book released 2021.02.04
  • Listening: TTS (text-to-speech) supported
  • File: PDF, 381 pages, 7.3 MB
  • Supported viewers: PC viewer, PAPER
  • ISBN: 9781800201057
  • ECN: -

Book Information

One-stop solution for NLP practitioners, ML developers, and data scientists to build effective NLP systems that can perform complicated real-world tasks

▶Book Description
Recently, there have been tremendous advances in NLP, and these advances are now moving from research labs into practical applications. This book offers a blend of the theoretical and practical aspects of current, complex NLP techniques.

The book focuses on innovative applications in the fields of NLP, language generation, and dialogue systems. It shows you how to pre-process text using techniques such as tokenization, part-of-speech (POS) tagging, and lemmatization with popular libraries such as Stanford NLP and spaCy. You will build a Named Entity Recognition (NER) system from scratch using Conditional Random Fields (CRFs) and Viterbi decoding on top of RNNs.
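
To make these pre-processing steps concrete, here is a minimal sketch using spaCy, one of the libraries named above; the sample sentence and the en_core_web_sm model choice are illustrative assumptions, not taken from the book.

```python
# Minimal spaCy pre-processing sketch: tokenization, POS tagging, and
# lemmatization. Assumes the small English model has been installed with:
#   python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The striped bats were hanging on their feet.")

for token in doc:
    # Surface form, coarse part-of-speech tag, and lemma for each token.
    print(f"{token.text:10} {token.pos_:6} {token.lemma_}")
```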

The book covers key emerging areas such as generating text for use in sentence completion and text summarization, bridging images and text by generating captions for images, and managing dialogue aspects of chatbots. You will learn how to apply transfer learning and fine-tuning using TensorFlow 2.

Further, it covers practical techniques that can simplify the labelling of textual data. The book also provides working code for each technique that you can adapt to your own use cases.

By the end of the book, you will have advanced knowledge of the tools, techniques, and deep learning architectures used to solve complex NLP problems.

▶What You Will Learn
-Grasp important pre-processing steps in building NLP applications, like POS tagging
-Apply transfer learning and weakly supervised learning using libraries like Snorkel
-Perform sentiment analysis using BERT
-Apply encoder-decoder neural network architectures and beam search for summarizing texts
-Use Transformer models with attention to bring images and text together
-Build apps that generate captions and answer questions about images using custom Transformers
-Use advanced TensorFlow techniques like learning rate annealing, custom layers, and custom loss functions to build the latest deep NLP models (a minimal sketch follows this list)
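
As a rough illustration of the last point, the sketch below wires a custom loss function and learning rate annealing (here, cosine decay) into a tf.keras model; the toy two-layer model and hyperparameters are placeholder assumptions, not examples from the book.

```python
# Hedged sketch: custom loss + learning-rate annealing in TensorFlow 2.
import tensorflow as tf

def smoothed_crossentropy(y_true, y_pred):
    # Example custom loss: categorical cross-entropy with label smoothing.
    return tf.keras.losses.categorical_crossentropy(
        y_true, y_pred, label_smoothing=0.1)

# Cosine annealing of the learning rate over 10,000 training steps.
schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=1e-3, decay_steps=10_000)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=schedule),
              loss=smoothed_crossentropy,
              metrics=["accuracy"])
```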

▶Key Features
-Apply deep learning algorithms and techniques such as BiLSTMs, CRFs, BPE, and more using TensorFlow 2
-Explore applications like text generation, summarization, weakly supervised labelling, and more
-Read cutting-edge material, with seminal papers and full working code provided in the GitHub repository

▶Who This Book Is For
This is not an introductory book. It assumes the reader is familiar with the basics of NLP, has fundamental Python skills, and has basic knowledge of machine learning and undergraduate-level calculus and linear algebra.

The readers who can benefit the most from this book include intermediate ML developers who are familiar with the basics of supervised learning and deep learning techniques, and professionals who already use TensorFlow/Python for purposes such as data science, ML, research, and analysis.

▶What this book covers
- Chapter 1, Essentials of NLP, provides an overview of various topics in NLP such as tokenization, stemming, lemmatization, POS tagging, and vectorization. It also gives an overview of common NLP libraries like spaCy, Stanford NLP, and NLTK, with their key capabilities and use cases. We will also build a simple spam classifier.
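
As a rough sketch of what such a spam classifier can look like in TensorFlow 2 (not the chapter's actual code), the example below uses a TextVectorization layer in TF-IDF mode over a tiny placeholder dataset; it assumes TensorFlow 2.6+, where TextVectorization lives in tf.keras.layers.

```python
# Illustrative spam classifier: TF-IDF text vectorization + a small dense net.
import tensorflow as tf

texts = ["win a free prize now", "meeting moved to 3pm",
         "free cash offer", "see you at lunch"]      # placeholder corpus
labels = [1, 0, 1, 0]                                 # 1 = spam, 0 = ham

vectorizer = tf.keras.layers.TextVectorization(output_mode="tf_idf",
                                               max_tokens=1000)
vectorizer.adapt(texts)

model = tf.keras.Sequential([
    vectorizer,
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(tf.constant(texts), tf.constant(labels), epochs=5, verbose=0)
```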

- Chapter 2, Understanding Sentiment in Natural Language with BiLSTMs, covers the NLU use case of sentiment analysis with an overview of Recurrent Neural Networks (RNNs), LSTMs, and BiLSTMs, which are the basic building blocks of modern NLP models. We will also use tf.data for efficient use of CPUs and GPUs to speed up data pipelines and model training.
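
A minimal sketch of this kind of model and pipeline follows; the vocabulary size, layer sizes, and the assumption of integer-encoded reviews are placeholders, not the chapter's actual configuration.

```python
# BiLSTM sentiment classifier plus a tf.data input pipeline (TensorFlow 2).
import tensorflow as tf

VOCAB_SIZE = 20_000   # assumed vocabulary size for integer-encoded reviews

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64, mask_zero=True),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

def make_dataset(sequences, labels, batch_size=32):
    # tf.data pipeline: shuffle, batch, and prefetch so the CPU prepares the
    # next batch while the GPU trains on the current one.
    ds = tf.data.Dataset.from_tensor_slices((sequences, labels))
    return ds.shuffle(10_000).batch(batch_size).prefetch(tf.data.AUTOTUNE)
```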

- Chapter 3, Named Entity Recognition (NER) with BiLSTMs, CRFs, and Viterbi Decoding, focuses on the key NLU problem of NER, which is a basic building block of task-oriented chatbots. We will build a custom CRF layer to improve the accuracy of NER, along with the Viterbi decoding scheme, which is often applied on top of a deep model to improve the quality of the output.
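
To illustrate the decoding half of that pipeline, here is a standalone Viterbi decoder over per-token tag scores and a tag-transition matrix; the random scores are placeholders for a BiLSTM-CRF's actual outputs, and this is not the book's implementation.

```python
# Viterbi decoding: find the highest-scoring tag sequence given emission
# scores (one row per token) and a tag-to-tag transition matrix.
import numpy as np

def viterbi_decode(emissions, transitions):
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()                     # best score ending in each tag
    backpointers = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # total[i, j] = best score via previous tag i, then tag j at step t
        total = score[:, None] + transitions + emissions[t][None, :]
        backpointers[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # Walk the backpointers from the best final tag to recover the path.
    best_path = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best_path.append(int(backpointers[t, best_path[-1]]))
    return best_path[::-1]

tags = viterbi_decode(np.random.randn(5, 4), np.random.randn(4, 4))
```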

- Chapter 4, Transfer Learning with BERT, covers a number of important concepts in modern deep NLP such as types of transfer learning, pre-trained embeddings, an overview of Transformers, and BERT and its application in improving the sentiment analysis task introduced in Chapter 2, Understanding Sentiment in Natural Language with BiLSTMs.
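
A hedged sketch of this kind of fine-tuning is shown below; the use of the Hugging Face transformers library is an assumption and may differ from the chapter's exact approach, and the two-example dataset and hyperparameters are placeholders.

```python
# Sketch: fine-tuning BERT for binary sentiment classification in TensorFlow 2.
# Assumes the Hugging Face transformers library is installed; API details can
# vary across transformers versions.
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased",
                                                        num_labels=2)

texts = ["great movie, loved it", "terrible plot and acting"]  # placeholders
labels = tf.constant([1, 0])
enc = tokenizer(texts, padding=True, truncation=True,
                max_length=128, return_tensors="tf")

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
model.fit(dict(enc), labels, epochs=1)
```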

- Chapter 5, Generating Text with RNNs and GPT-2, focuses on generating text with a custom character-based RNN and improving it with Beam Search. We will also cover the GPT-2 architecture and touch upon GPT-3.
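
The sketch below shows the beam search idea in isolation: the hypothetical next_log_probs(sequence) callback stands in for a trained character-level RNN (or GPT-2) that returns (token, log-probability) pairs; it is not the book's code.

```python
# Generic beam search over a next-token scoring function.
import heapq

def beam_search(next_log_probs, start, max_len=20, beam_width=3, eos="</s>"):
    # Each beam entry is (cumulative log-probability, token sequence).
    beams = [(0.0, [start])]
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == eos:              # keep finished hypotheses as-is
                candidates.append((score, seq))
                continue
            for token, logp in next_log_probs(seq):
                candidates.append((score + logp, seq + [token]))
        # Retain only the `beam_width` highest-scoring partial sequences.
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    return max(beams, key=lambda c: c[0])[1]
```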

- Chapter 6, Text Summarization with Seq2seq Attention and Transformer Networks, takes on the challenging task of abstractive text summarization. BERT and GPT correspond to the encoder and decoder halves of the full encoder-decoder model; we put them together to build a seq2seq model for summarizing news articles by generating headlines for them. How ROUGE metrics are used for the evaluation of summarization is also covered.
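
As a pared-down illustration of the evaluation step, here is a plain-Python ROUGE-1 calculation (unigram precision, recall, and F1); real evaluations typically rely on an established ROUGE implementation rather than a hand-rolled one like this.

```python
# ROUGE-1: unigram overlap between a reference headline and a generated one.
from collections import Counter

def rouge_1(reference, candidate):
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())          # clipped unigram matches
    recall = overlap / max(sum(ref.values()), 1)
    precision = overlap / max(sum(cand.values()), 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-9)
    return {"precision": precision, "recall": recall, "f1": f1}

print(rouge_1("police arrest suspect in bank robbery",
              "police arrest bank robbery suspect"))
```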

- Chapter 7, Multi-Modal Networks and Image Captioning with ResNets and Transformers, brings computer vision and NLP together to see if a picture is indeed worth a thousand words! We will build a custom Transformer model from scratch and train it to generate captions for images.
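
The vision half of that pipeline can be sketched as follows: a pretrained ResNet50 from tf.keras.applications turns an image into a fixed-length feature vector that a caption-generating Transformer decoder (omitted here) would attend over; the image path is a placeholder.

```python
# Image feature extraction with a pretrained ResNet50 (ImageNet weights).
import tensorflow as tf

extractor = tf.keras.applications.ResNet50(include_top=False,
                                            weights="imagenet",
                                            pooling="avg")

def image_features(path):
    # Read, resize, and normalize the image, then return a (1, 2048) vector.
    img = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
    img = tf.image.resize(img, (224, 224))
    img = tf.keras.applications.resnet50.preprocess_input(tf.cast(img, tf.float32))
    return extractor(tf.expand_dims(img, 0))
```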

- Chapter 8, Weakly Supervised Learning for Classification with Snorkel, focuses on a key problem: labeling data. While NLP has a lot of unlabeled data, labeling it is quite an expensive task. This chapter introduces the Snorkel library and shows how massive amounts of data can be quickly labeled.
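
A hedged sketch of the labeling-function idea follows, using the Snorkel API as commonly documented; the keyword rules, the tiny DataFrame, and the spam/ham task are illustrative and not taken from the chapter.

```python
# Weak supervision sketch: two labeling functions combined by Snorkel's LabelModel.
import pandas as pd
from snorkel.labeling import labeling_function, PandasLFApplier
from snorkel.labeling.model import LabelModel

SPAM, HAM, ABSTAIN = 1, 0, -1

@labeling_function()
def lf_contains_free(x):
    return SPAM if "free" in x.text.lower() else ABSTAIN

@labeling_function()
def lf_short_message(x):
    return HAM if len(x.text.split()) < 5 else ABSTAIN

df = pd.DataFrame({"text": ["win a free prize today",
                            "ok see you soon",
                            "free cash offer just for you"]})
L = PandasLFApplier(lfs=[lf_contains_free, lf_short_message]).apply(df)

# The label model learns LF accuracies and outputs probabilistic labels.
label_model = LabelModel(cardinality=2, verbose=False)
label_model.fit(L_train=L, n_epochs=100)
probs = label_model.predict_proba(L)
```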

- Chapter 9, Building Conversational AI Applications with Deep Learning, combines the various techniques covered throughout the book to show how different types of chatbots, such as question-answering or slot-filling bots, can be built.

- Chapter 10, Installation and Setup Instructions for Code, walks through all the instructions required to install and configure a system for running the code supplied with the book.

About the Author

- Ashish Bansal
Ashish is an AI/ML leader, a well-known speaker, and an astute technologist with over 20 years of experience in the field. He has a Bachelor's in technology from IIT BHU, and an MBA in marketing from Kellogg School of Management. He is currently the Director of Recommendations at Twitch where he works on building scalable recommendation systems across a variety of product surfaces, connecting content to people. He has worked on recommendation systems at multiple organizations, most notably Twitter where he led Trends and Events recommendations and at Capital One where he worked on B2B and B2C products. Ashish was also a co-founder of GALE Partners, a full-service digital agency in Toronto, and spent over 9 years at SapientNitro, a leading digital agency.

