
awesome-nlp 

A curated list of resources dedicated to Natural Language Processing

Maintainers - Keon Kim, Martin Park

Please read the contribution guidelines before contributing.

Please feel free to create pull requests, or email Martin Park/Keon Kim to add links.

Table of Contents

Tutorials and Courses

  • Tensor Flow Tutorial on  Models
  • Natural Language Understanding with Distributed Representation  by Cho

Videos

  •  on NLP from basics
  •  on Coursera by U of Michigan
  •  course on Udacity which also covers NLP
  •  by Richard Socher
  •  by Richard Socher. Updated to make use of TensorFlow. Note that there are some lectures missing (lecture 9, and lectures 12 onwards).
  •  - course on Coursera that was only offered in 2013. The videos are not available at the moment. Mike Collins is also a great professor, and his notes and lectures are very good.
  •  - a Machine Translation course with great assignments and slides.
  •  - course by  on Natural Language Processing. Good notes and some good lectures on YouTube about HMMs.
  •  Deep Learning course on Udacity (using TensorFlow) which covers a section on using deep learning for NLP tasks (covering Word2Vec, RNNs and LSTMs).
  •  by Harrison Kinsley (sentdex). Good tutorials with NLTK code implementations.

Deep Learning for NLP

Class by Richard Socher. The 2016 content was updated to make use of TensorFlow. Lecture slides, reading materials, and videos for the 2016 class are available, though some lecture videos are missing (lecture 9, and lectures 12 onwards). All videos for the 2015 class are also available.

 Deep Learning course on Udacity (using TensorFlow) which includes a section on using deep learning for NLP tasks, covering how to implement Word2Vec, RNNs, and LSTMs.

Yoav Goldberg. October 2015. No new info; a 75-page summary of the state of the art.

Packages

Implementations

  •  by Koc AI-Lab
  •  by Mikolov
  •  by Turian
  •  by Dhillon
  •  by Huang

Libraries

  • Node.js and JavaScript - Node.js Libraries for NLP

    •  - A JavaScript implementation of Twitter's text processing library
    •  - A Natural Language Processor in JS
    •  - Extensible system for analyzing and manipulating natural language
    •  - Natural Language processing in the browser
    •  - general natural language facilities for node
  • Python - Python NLP Libraries (a small NLTK usage sketch follows this list)

    •  - A web mining module for the Python programming language. It has tools for natural language processing, machine learning, and more.
    •  - Providing a consistent API for diving into common natural language processing (NLP) tasks. Stands on the giant shoulders of NLTK and Pattern, and plays nicely with both.
    •  - A sentence aligner, a friendly tool for extracting parallel sentences from comparable corpora.
    •  - Chinese word segmentation utilities.
    •  - A library for processing Chinese text.
    •  - A Python package for Korean natural language processing.
    •  - Text processing tools and wrappers (e.g. Vowpal Wabbit)
    •  - Python bindings for the BLLIP Natural Language Parser (also known as the Charniak-Johnson parser)
    •  - Python Natural Language Processing Library. General purpose NLP library for Python. Also contains some specific modules for parsing common NLP formats, most notably for , but also ARPA language models, Moses phrasetables, GIZA++ alignments.
    •  - Python binding to ucto (a unicode-aware rule-based tokenizer for various languages)
    •  - Python binding to Frog, an NLP suite for Dutch. (pos tagging, lemmatisation, dependency parsing, NER)
    •  - Python bindings for , a statistical part-of-speech tagger, constituency parser, and dependency parser for English.
    •  - Python binding to a C++ library for extracting and working with basic linguistic constructions such as n-grams and skipgrams in a quick and memory-efficient way.
    •  - Industrial strength NLP with Python and Cython.
    •  - Python interface for converting Penn Treebank trees to Stanford Dependencies.
  • C++ - C++ Libraries

    •  - C, C++, and Python tools for named entity recognition and relation extraction
    •  - Open source implementation of Conditional Random Fields (CRFs) for segmenting/labeling sequential data & other Natural Language Processing tasks.
    •  - CRFsuite is an implementation of Conditional Random Fields (CRFs) for labeling sequential data.
    •  - BLLIP Natural Language Parser (also known as the Charniak-Johnson parser)
    •  - C++ library, command line tools, and Python binding for extracting and working with basic linguistic constructions such as n-grams and skipgrams in a quick and memory-efficient way.
    •  - Unicode-aware regular-expression based tokenizer for various languages. Tool and C++ library. Supports FoLiA format.
    •  - C++ library for the 
    •  - Memory-based NLP suite developed for Dutch: PoS tagger, lemmatiser, dependency parser, NER, shallow parser, morphological analyzer.
    •  -  is a C++ Data Sciences Toolkit that facilitates mining big text data.
  • Java - Java NLP Libraries

    •  Web-Scale Open Information Extraction
    •  An efficient and flexible token-based regular expression language and engine.
    •  - Core libraries developed by the University of Illinois' Cognitive Computation Group.
  • Clojure

    •  - Natural Language Processing in Clojure (opennlp)
    •  - Rails-like inflection library for Clojure and ClojureScript
  • Ruby

    • Kevin Dias's 
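
As a quick taste of the Python libraries above, here is a minimal, hedged NLTK sketch for tokenization and part-of-speech tagging. The example sentence is illustrative only, and the resource names passed to nltk.download may vary slightly across NLTK versions.

```python
# Minimal NLTK usage sketch: tokenization and part-of-speech tagging.
import nltk

# One-time downloads of the tokenizer and tagger models (cached locally).
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")

sentence = "Natural language processing with Python is fun."
tokens = nltk.word_tokenize(sentence)   # ['Natural', 'language', 'processing', ...]
tagged = nltk.pos_tag(tokens)           # pairs like ('language', 'NN')
print(tagged)
```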

Services

  •  - Natural Language Interface for apps and devices.

Articles

Review Articles

Word Vectors

Resources about word vectors, aka word embeddings, and distributed representations for words.

Word vectors are numeric representations of words that are often used as input to deep learning systems. This process is sometimes called pretraining.
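
To make the idea concrete, here is a toy sketch with hand-written 4-dimensional vectors; the numbers are made up for illustration, and real embeddings are learned by models such as word2vec or GloVe, typically with 50-300 dimensions.

```python
import numpy as np

# Toy hand-written "word vectors"; real embeddings are learned, not hand-set.
vectors = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.8, 0.1, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.4, 0.2]),
}

def cosine(a, b):
    # Cosine similarity: the standard way to compare word vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["king"], vectors["queen"]))  # related words -> higher similarity
print(cosine(vectors["king"], vectors["apple"]))  # unrelated words -> lower similarity
```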

Mikolov et al. 2013.
Generates word and phrase vectors. Performs well on word similarity and analogy tasks, and subsamples frequent words (i.e. frequent words like "the" are skipped periodically to speed things up and improve vectors for less frequently used words).
 in 
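
A minimal sketch of training such vectors with the gensim Word2Vec implementation (one common implementation, not necessarily the reference C code). The toy corpus and hyperparameters are placeholders; `sample` is the frequent-word subsampling threshold described above.

```python
from gensim.models import Word2Vec

# Toy corpus; a real run would use millions of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the vectors (gensim 4.x; `size` in 3.x)
    window=5,
    min_count=1,
    sg=1,             # skip-gram architecture
    sample=1e-3,      # subsampling threshold: very frequent words are randomly skipped
)

print(model.wv["king"].shape)                 # (50,) -- the numeric vector for "king"
print(model.wv.most_similar("king", topn=3))  # word-similarity query
# Analogy-style query (needs a much larger corpus to give sensible answers):
# model.wv.most_similar(positive=["king", "woman"], negative=["man"])
```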

Chris Olah (2014) Blog post explaining word2vec.

Pennington, Socher, Manning. 2014. Creates word vectors and relates word2vec to matrix factorizations.  by 

  •  - on creating vectors to represent language, useful for RNN inputs
  •  - on word sense disambiguation
  •  - new
  •  - word representation method
  •  - similar approach, with adaptive properties

Thought Vectors

Thought vectors are numeric representations for sentences, paragraphs, and documents. The following papers are listed in order of date published; each one replaces the last as the state of the art in sentiment analysis.

Socher et al. 2013. Introduces Recursive Neural Tensor Network. Uses a parse tree.

Le, Mikolov. 2014. Introduces Paragraph Vector. Concatenates and averages pretrained, fixed word vectors to create vectors for sentences, paragraphs, and documents. Also known as paragraph2vec. Doesn't use a parse tree.
Implemented in . See 
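
A minimal sketch using gensim's Doc2Vec, one implementation of paragraph2vec; the corpus and parameters here are toy placeholders.

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy corpus of tagged documents.
docs = [
    TaggedDocument(words=["the", "movie", "was", "great"], tags=["doc0"]),
    TaggedDocument(words=["the", "film", "was", "terrible"], tags=["doc1"]),
]

model = Doc2Vec(docs, vector_size=50, min_count=1, epochs=40)

print(model.dv["doc0"].shape)                       # fixed-size vector per document (gensim 4.x API)
print(model.infer_vector(["a", "great", "movie"]))  # vector for unseen text; no parse tree needed
```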

Irsoy & Cardie. 2014. Uses Deep Recursive Neural Networks. Uses a parse tree.

Tai et al. 2015. Introduces Tree-LSTM. Uses a parse tree.

Dai, Le. 2015. "With pretraining, we are able to train long short term memory recurrent networks up to a few hundred timesteps, thereby achieving strong performance in many text classification tasks, such as IMDB, DBpedia and 20 Newsgroups."
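
For orientation, a minimal PyTorch sketch of an LSTM text classifier of the kind evaluated in that paper; the vocabulary size, dimensions, and the omission of the pretraining step are simplifications, not the authors' exact setup.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.fc = nn.Linear(hid_dim, num_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        _, (h, _) = self.lstm(x)            # h: (1, batch, hid_dim), final hidden state
        return self.fc(h[-1])               # logits over classes

model = LSTMClassifier()
# In Dai & Le's setup the LSTM weights would first be pretrained (e.g. as a
# language model or sequence autoencoder) before this supervised training.
fake_batch = torch.randint(0, 10000, (4, 100))  # 4 documents of 100 token ids each
print(model(fake_batch).shape)                  # torch.Size([4, 2])
```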

Machine Translation

 Bahdanau, Cho 2014. "comparable to the existing state-of-the-art phrase-based system on the task of English-to-French translation." Implements an attention mechanism.

Sutskever, Vinyals, Le 2014. (). Uses LSTM RNNs to generate translations. " Our main result is that on an English to French translation task from the WMT’14 dataset, the translations produced by the LSTM achieve a BLEU score of 34.8"
 in
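
A minimal PyTorch sketch of the encoder-decoder idea behind these models, without the attention mechanism from the Bahdanau & Cho paper; vocabulary sizes and dimensions are placeholders.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                 # src: (batch, src_len) of token ids
        _, (h, c) = self.lstm(self.embed(src))
        return h, c                          # fixed-size summary of the source sentence

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt, state):           # tgt: (batch, tgt_len), teacher forcing
        output, state = self.lstm(self.embed(tgt), state)
        return self.out(output), state       # logits over the target vocabulary

src = torch.randint(0, 1000, (2, 7))   # 2 source sentences of 7 token ids
tgt = torch.randint(0, 1200, (2, 9))   # 2 target sentences (shifted, for teacher forcing)
enc, dec = Encoder(1000), Decoder(1200)
logits, _ = dec(tgt, enc(src))
print(logits.shape)                    # torch.Size([2, 9, 1200])
```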

Single Exchange Dialogs

Sordoni 2015. Generates responses to tweets.
Uses  source code: 

Shang et al. 2015. Uses Neural Responding Machine. Trained on a Weibo dataset. Achieves one-round conversations with 75% appropriate responses.

Vinyals, Le 2015. Uses LSTM RNNs to generate conversational responses. Uses . Seq2Seq was originally designed for machine translation and it "translates" a single sentence, up to around 79 words, to a single sentence response, and has no memory of previous dialog exchanges. Used in Google 

Memory and Attention Models (from )

 Weston et al. 2014, and  Sukhbaatar et al. 2015.

Memory networks are implemented in . Attempts to solve tasks of reasoning, attention, and memory.
Weston 2015. Classifies QA tasks like single factoid, yes/no, etc. Extends memory networks.
Dodge et al. 2015. Tests Memory Networks on 4 tasks, including a Reddit dialog task.
See 

Graves et al. 2014.

Joulin, Mikolov 2015.  and 

General Natural Language Processing

  •  - LSTM representation
  •  - word vectors for machine translation
  •  - DeepMind paper
  • Tutorial on Markov Logic Networks ()

Named Entity Recognition

Neural Network

Supplementary Materials

Blogs

  • Blog Post on 
  • Blog Post on 
  •  by Hal Daumé III
  •  by Brian McFee

Credits

Parts of the lists are from

 
