Recurrent Neural Network based Probabilistic Language Model. Speech Recognition with Probabilistic Language Model

In stock
Location: Almaty. Copy condition: new.
Paper edition
Author: Sathyanarayanan Kuppusami
ISBN: 9786202205443
Year of publication: 2017
Book format: 60×90/16 (145×215 mm)
Number of pages: 60
Publisher: AV Akademikerverlag
Price: 16,526 KZT
Product code: 180124
Annotation: Statistical n-gram language models are widely used for their state-of-the-art performance in continuous speech recognition systems. In a domain-based scenario, speakers use widely varying word sequences to express the same context, but holding every possible sequence in the training corpora to estimate n-gram probabilities is practically infeasible. Capturing long-distance dependencies in a sequence is an important feature of language models, since it allows a non-zero probability to be assigned to a sparse sequence during recognition. A simple back-off n-gram model struggles to estimate probabilities for sparse data as the n-gram order increases. Deducing knowledge from training patterns can also help a language model generalize to an unknown sequence or word from its linguistic properties, such as being a noun, being singular or plural, or appearing in a novel position in a sentence. Even for weak generalization, an n-gram model needs a very large training corpus. A simple recurrent neural network based language model is proposed here to efficiently overcome these difficulties for domain-based corpora.
Keywords: Recurrent Neural Network, language model, probabilistic neural model
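
For readers new to the topic, the sketch below (not taken from the book) illustrates the idea in Python/NumPy: an Elman-style recurrent network carries a hidden state that summarizes the whole word history, so it can assign a non-zero probability to a domain sequence it never saw verbatim, which is exactly where back-off n-grams struggle. The toy vocabulary, layer sizes, and untrained random weights are illustrative assumptions only.

import numpy as np

# Minimal Elman-style RNN language model sketch (illustrative assumptions:
# toy vocabulary, hidden size 8, untrained random weights).
rng = np.random.default_rng(0)
vocab = ["<s>", "speech", "recognition", "language", "model", "</s>"]
word_to_id = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 8                          # vocabulary size, hidden layer size

# In practice these weights are learned with backpropagation through time
# on the domain corpus; random values here only demonstrate the mechanics.
W_xh = rng.normal(scale=0.1, size=(H, V))     # previous word -> hidden
W_hh = rng.normal(scale=0.1, size=(H, H))     # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(V, H))     # hidden -> next-word scores

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sequence_log_probability(words):
    """Log P(w_1 .. w_n | <s>): the hidden state h carries the full history."""
    h = np.zeros(H)
    prev = word_to_id["<s>"]
    log_p = 0.0
    for w in words:
        x = np.zeros(V)
        x[prev] = 1.0                         # one-hot encoding of previous word
        h = np.tanh(W_xh @ x + W_hh @ h)      # update recurrent hidden state
        p = softmax(W_hy @ h)                 # distribution over the next word
        log_p += np.log(p[word_to_id[w]])
        prev = word_to_id[w]
    return log_p

print(sequence_log_probability(["speech", "recognition", "language", "model", "</s>"]))

Unlike an n-gram table, nothing here grows with the n-gram order: longer histories are folded into the fixed-size hidden state h.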
Similar editions
Fields of knowledge: Science in general
SREE LAKSHMI DONE
Convolution Recurrent Neural Networks for Image Classification. Enhancing Visual Communication and Expression in Instant Messaging Platforms.
1905, 72 pages, softcover
"Utilizing Convolution Recurrent Neural Networks for Image Classification in Instant Messengers" delves into the integration of CRNN models within instant messengers for effective image classification. This book explores how CRNN techniques can enhance the accuracy and efficiency of image recognition, enabling seamless visual...

27,521 KZT
Paper edition
Fields of knowledge: Exact sciences -> Mathematics
Kassahun Tesfaye
Recurrent Neural Network. Artificial and Natural Neural Networks.
1905, 56 pages, softcover
This book is about neural networks in general and recurrent neural networks in particular. Traditionally, a neural network meant a network of biological neurons and was regarded as a simple calculator. In modern usage, however, the term refers to networks of artificial neurons, which are simple to...

26,952 KZT
Paper edition
Fields of knowledge: Natural sciences -> Biology
Dilpreet Kaur and Shailendra Singh
Protein-Protein Interactions Classification. Based on Recurrent Neural Network.
2012, 88 pages, softcover
Proteomics is the large-scale study of proteins, particularly their structures and functions. Proteins are vital parts of living organisms, as they are the main components of the physiological metabolic pathways of cells. Most proteins function in collaboration with other proteins and one goal of proteomics is to identify which proteins interact....

30,784 KZT
Paper edition
Fields of knowledge: Exact sciences -> Computer science and programming -> Information technology
Conrad Tiflin
LSTM Recurrent Neural Networks for Signature Verification. A Novel Approach.
2012, 104 pages, softcover
The author investigated the application of Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) to the task of signature verification. Traditional RNNs are capable of modeling dynamical systems with hidden states; they have been successfully applied to domains ranging from financial forecasting to control and speech recognition. This...

31,353 KZT
Paper edition
Economic sectors: Industry in general
Sanjay Kumar Boddhu
Towards Building a Neuromorphic Computer. A Reconfigurable Hardware Continuous Time Recurrent Neural Network.
2014, 108 pages, softcover
In this work, a complete description of the design for implementing a neuromorphic computer is provided, with details of the components and concepts used to make it feasible in hardware. The final hardware implementation is tested with well-defined and well-analyzed configurations, and the hardware behaves in concurrence with its simulated...

37,255 KZT
Paper edition
Fields of knowledge: Humanities -> Psychology
Tarik Rashid
Recurrent Neural Network Model.
2013, 172 pages, softcover
Neural networks differ from other models in their ability to map inputs to outputs and to build complex relationships among variables without specifying them explicitly. In this work we provide an extensive literature survey of the related problems and study several approaches, including conventional predictive methods. As a result of our...

42,170 KZT
Paper edition