Takashi Morita

Oscillations enhance time-series prediction in reservoir computing with feedback
Jun 05, 2024

Positional Encoding Helps Recurrent Neural Networks Handle a Large Vocabulary
Jan 31, 2024

Adaptive Uncertainty-Guided Model Selection for Data-Driven PDE Discovery
Aug 31, 2023

Noise-aware Physics-informed Machine Learning for Robust PDE Discovery
Jul 04, 2022

Exploring TTS without T Using Biologically/Psychologically Motivated Neural Network Modules (ZeroSpeech 2020)
May 15, 2020

Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State
Mar 08, 2019

Superregular grammars do not provide additional explanatory power but allow for a compact analysis of animal song
Nov 05, 2018

RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency
Sep 05, 2018

What do RNN Language Models Learn about Filler-Gap Dependencies?
Aug 31, 2018