Commit dee6a3c1 authored by M. Francois's avatar M. Francois

Upgrade to pytorch>1 and add ZeroPoleFilters

parent f420f932
.idea/
.vscode/
*~
docs/html
MIT License
Copyright (c) 2019 François MARELLI (Idiap research institute)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
License
-------
Copyright (c) 2021 Idiap Research Institute, http://www.idiap.ch/
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
**************
Neural Filters
**************
This package implements Neural Filters related to the following paper:
"François Marelli, Bastian Schnell, Hervé Bourlard, Thierry Dutoit, and Philip N. Garner. An end-to-end network to synthesize intonation using a generalized command response model. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Brighton, UK, May 2019"
Neural Filters are linear IIR filters integrated into the PyTorch framework, which can be interfaced with neural networks and trained by gradient descent.
If you use this package and wish to acknowledge it in an academic publication, please cite the paper above.
(c) Idiap Research Institute, Francois Marelli <francois.marelli@idiap.ch>
Install instructions
====================
Run `pip install git+https://gitlab.idiap.ch/software/neural_filters.git`
Welcome to Neural Filters’s documentation!
******************************************
NeuralFilterCell
================
This module implements a basic trainable all-pole first-order filter
using PyTorch
Copyright (c) 2019 Idiap Research Institute, http://www.idiap.ch/
Written by Francois Marelli <Francois.Marelli@idiap.ch>
This file is part of neural_filters.
class neural_filters.neural_filter.NeuralFilter(hidden_size)
A trainable first-order all-pole filter \frac{1}{1 - P z^{-1}}
* **hidden_size** (int) - the size of the data vector
forward(input_var, hidden=None)
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note: Although the recipe for forward pass needs to be defined
within this function, one should call the "Module" instance
afterwards instead of this since the former takes care of
running the registered hooks while the latter silently ignores
them.
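The transfer function above corresponds to the one-tap recursion y[n] = x[n] + P * y[n-1]. A standalone NumPy sketch (independent of the PyTorch module; the helper name and pole value are illustrative only) showing that the impulse response is P^n:

```python
import numpy as np

def allpole1(x, P):
    """First-order all-pole filter 1 / (1 - P z^-1): y[n] = x[n] + P * y[n-1]."""
    y = np.zeros(len(x))
    prev = 0.0
    for n, xn in enumerate(x):
        prev = xn + P * prev
        y[n] = prev
    return y

# Feed a unit impulse through the filter: the response decays as P**n for |P| < 1
impulse = np.zeros(8)
impulse[0] = 1.0
h = allpole1(impulse, 0.5)
```

For a stable filter the learned pole must satisfy |P| < 1, which is why the trainable parameterization keeps it constrained.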
NeuralFilter2R
==============
This module implements a trainable all-pole second-order filter with
real poles using PyTorch
Copyright (c) 2019 Idiap Research Institute, http://www.idiap.ch/
Written by Francois Marelli <Francois.Marelli@idiap.ch>
This file is part of neural_filters.
class neural_filters.neural_filter_2R.NeuralFilter2R(hidden_size)
A trainable second-order all-(real)pole filter \frac{1}{1 - P_{1}
z^{-1}} \frac{1}{1 - P_{2} z^{-1}}
* **hidden_size** (int) - the size of the data vector
forward(input_var, hx=(None, None))
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note: Although the recipe for forward pass needs to be defined
within this function, one should call the "Module" instance
afterwards instead of this since the former takes care of
running the registered hooks while the latter silently ignores
them.
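Cascading the two first-order sections is equivalent to a single second-order recursion with denominator 1 - (P1 + P2) z^{-1} + P1 P2 z^{-2}. A NumPy sketch (helper names and pole values are illustrative only) checking the equivalence:

```python
import numpy as np

def allpole1(x, P):
    """First-order all-pole section: y[n] = x[n] + P * y[n-1]."""
    y = np.zeros(len(x))
    prev = 0.0
    for n, xn in enumerate(x):
        prev = xn + P * prev
        y[n] = prev
    return y

def allpole2_direct(x, P1, P2):
    """Direct-form recursion of 1 / ((1 - P1 z^-1)(1 - P2 z^-1))."""
    y = np.zeros(len(x))
    y1 = y2 = 0.0
    for n, xn in enumerate(x):
        yn = xn + (P1 + P2) * y1 - P1 * P2 * y2
        y[n] = yn
        y1, y2 = yn, y1
    return y

x = np.random.default_rng(0).standard_normal(32)
cascade = allpole1(allpole1(x, 0.4), 0.7)   # two first-order sections in series
direct = allpole2_direct(x, 0.4, 0.7)       # equivalent second-order filter
```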
NeuralFilter2CD
===============
This module implements a trainable critically damped all-pole
second-order filter with real poles using PyTorch
Copyright (c) 2019 Idiap Research Institute, http://www.idiap.ch/
Written by Francois Marelli <Francois.Marelli@idiap.ch>
This file is part of neural_filters.
class neural_filters.neural_filter_2CD.NeuralFilter2CD(hidden_size)
A trainable second-order critically damped all-pole filter
\frac{1}{(1 - P z^{-1})^{2}}
* **hidden_size** (int) - the size of the data vector
forward(input_var, hx=(None, None))
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note: Although the recipe for forward pass needs to be defined
within this function, one should call the "Module" instance
afterwards instead of this since the former takes care of
running the registered hooks while the latter silently ignores
them.
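With both poles equal, the impulse response of the critically damped filter has the closed form (n + 1) P^n. A NumPy check (helper name and pole value arbitrary):

```python
import numpy as np

def allpole1(x, P):
    """First-order all-pole section: y[n] = x[n] + P * y[n-1]."""
    y = np.zeros(len(x))
    prev = 0.0
    for n, xn in enumerate(x):
        prev = xn + P * prev
        y[n] = prev
    return y

# Critically damped: apply the same pole twice, i.e. 1 / (1 - P z^-1)^2
impulse = np.zeros(10)
impulse[0] = 1.0
h = allpole1(allpole1(impulse, 0.8), 0.8)

# Closed-form impulse response: (n + 1) * P**n
n = np.arange(10)
h_closed = (n + 1) * 0.8 ** n
```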
NeuralFilter2CC
===============
This module implements a trainable all-pole second-order filter with
complex conjugate poles using PyTorch
Copyright (c) 2019 Idiap Research Institute, http://www.idiap.ch/
Written by Francois Marelli <Francois.Marelli@idiap.ch>
This file is part of neural_filters.
class neural_filters.neural_filter_2CC.NeuralFilter2CC(hidden_size)
A trainable second-order all-pole filter \frac{1}{1 - 2 P
\cos(\theta) z^{-1} + P^{2} z^{-2}}
* **hidden_size** (int) - the size of the data vector
forward(input_var, hidden=None)
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note: Although the recipe for forward pass needs to be defined
within this function, one should call the "Module" instance
afterwards instead of this since the former takes care of
running the registered hooks while the latter silently ignores
them.
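This transfer function expands to the recursion y[n] = x[n] + 2 P cos(θ) y[n-1] - P² y[n-2], whose impulse response is the damped sinusoid P^n sin((n+1)θ) / sin(θ). A NumPy sketch (helper name, modulus, and angle are illustrative only):

```python
import numpy as np

def allpole_cc(x, P, theta):
    """Recursion of 1 / (1 - 2 P cos(theta) z^-1 + P^2 z^-2) with conjugate poles P e^{+-i theta}."""
    y = np.zeros(len(x))
    y1 = y2 = 0.0
    a1, a2 = 2 * P * np.cos(theta), -P ** 2
    for n, xn in enumerate(x):
        yn = xn + a1 * y1 + a2 * y2
        y[n] = yn
        y1, y2 = yn, y1
    return y

# Impulse response: a damped sinusoid P**n * sin((n+1)*theta) / sin(theta)
impulse = np.zeros(16)
impulse[0] = 1.0
h = allpole_cc(impulse, 0.9, np.pi / 5)

n = np.arange(16)
h_closed = 0.9 ** n * np.sin((n + 1) * np.pi / 5) / np.sin(np.pi / 5)
```

The modulus P controls the damping and the angle θ the oscillation frequency, which is why the trainable parameters of this cell are naturally a (modulus, angle) pair.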
Indices and tables
******************
* Index
* Search Page
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
all:
make doc
.PHONY: help
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " epub3 to make an epub3"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
@echo " dummy to check syntax errors of document sources"
.PHONY: doc
doc:
make clean
make html
make text
rm -rf ../html
rm -f ../README.rst
cp -r $(BUILDDIR)/html ../html
cp $(BUILDDIR)/text/index.txt ../README.rst
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/*
.PHONY: html
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/NeuralFilters.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/NeuralFilters.qhc"
.PHONY: applehelp
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/NeuralFilters"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/NeuralFilters"
@echo "# devhelp"
.PHONY: epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: epub3
epub3:
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
@echo
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
.PHONY: latex
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: dummy
dummy:
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
@echo
@echo "Build finished. Dummy builder generates no files."
.. Neural Filters documentation master file, created by
sphinx-quickstart on Fri Feb 16 15:08:23 2018.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Neural Filters's documentation!
==========================================
.. toctree::
:maxdepth: 2
:caption: Contents:
.. automodule:: neural_filters.neural_filter
:members:
.. automodule:: neural_filters.neural_filter_2R
:members:
.. automodule:: neural_filters.neural_filter_2CD
:members:
.. automodule:: neural_filters.neural_filter_2CC
:members:
Indices and tables
==================
* :ref:`genindex`
* :ref:`search`
"""
neural_filters
Neural Filters
**************
Copyright (c) 2018 Idiap Research Institute, http://www.idiap.ch/
This package implements a trainable all-pole second order filter with complex conjugate poles using pyTorch.
Written by Francois Marelli <Francois.Marelli@idiap.ch>
"""
This file is part of neural_filters.
# Copyright (c) 2021 Idiap Research Institute, http://www.idiap.ch/
# Written by François Marelli <francois.marelli@idiap.ch>
#
# This file is part of Neural Filters.
#
# Neural Filters is free software: you can redistribute it and/or modify
# it under the terms of the 3-Clause BSD License.
#
# Neural Filters is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# 3-Clause BSD License for more details.
#
# You should have received a copy of the 3-Clause BSD License along
# with Neural Filters. If not, see https://opensource.org/licenses/BSD-3-Clause.
#
# SPDX-License-Identifier: BSD-3-Clause
import pkg_resources
import numpy as np
from abc import ABC, abstractmethod
from torch import nn
from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence, PackedSequence

from .lfilter_grad import lfilter
from .neural_filter import *
from .neural_filter_2CC import *
from .neural_filter_2CD import *
from .neural_filter_2R import *
from .zero_pole_filter import *

__version__ = pkg_resources.get_distribution('neural_filters').version

EPSILON = 1e-6
INIT_MODULUS = 0.95
MIN_ANGLE = 0
MAX_ANGLE = np.pi / 2


def asig(x):
    # Inverse sigmoid (logit), with 0 and 1 nudged inside the open interval
    if not isinstance(x, np.ndarray):
        x = np.array(x, ndmin=1)

    x[x == 1] = 1 - EPSILON
    x[x == 0] = EPSILON

    return -np.log((1 / x) - 1)


def atanh(x):
    # Inverse hyperbolic tangent, with inputs of modulus 1 pulled inside the domain
    if not isinstance(x, np.ndarray):
        x = np.array(x, ndmin=1)

    x[abs(x) == 1] *= (1 - EPSILON)
    return np.arctanh(x)


class FilterBase(nn.Module, ABC):
    """
    A trainable filter
    """

    def __init__(self, batch_first=False):
        super().__init__()

        # Transpose dimensions to bring time last
        self.tdims = (-1, 0)

        # Is the batch dimension the first?
        if batch_first:
            self.tdims = (-1, 1)

    def forward(self, input_var):
        """
        Apply the filter

        Parameters
        ----------
        input_var : Tensor
            dimensions should be:
            (time, batch, *features) if batch_first=False
            (batch, time, *features) if batch_first=True

        Returns
        -------
        filtered : Tensor
            dimensions match the input_var
        """
        is_packed = isinstance(input_var, PackedSequence)
        if is_packed:
            input_var, batch_sizes = pad_packed_sequence(input_var)

        # Compute coefficients for numerator and denominator
        a_coef, b_coef = self.coeffs()

        # Transpose time to last dimension
        input_var = input_var.transpose(*self.tdims)

        # Apply autograd lfilter
        output = lfilter(input_var, a_coef, b_coef)

        # Transpose time back to its original position
        output = output.transpose(*self.tdims)

        if is_packed:
            output = pack_padded_sequence(output, batch_sizes)

        return output

    @abstractmethod
    def coeffs(self):
        pass
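The `asig` helper above is the inverse sigmoid (logit): it lets a pole constrained to the open interval (0, 1) be stored as an unconstrained real parameter, with the endpoints nudged inward by EPSILON to keep the logarithm finite. A quick NumPy round-trip check (pole values chosen arbitrarily for illustration):

```python
import numpy as np

EPSILON = 1e-6

def asig(x):
    """Inverse sigmoid (logit), with 0 and 1 nudged inside the open interval."""
    if not isinstance(x, np.ndarray):
        x = np.array(x, ndmin=1, dtype=float)
    x[x == 1] = 1 - EPSILON
    x[x == 0] = EPSILON
    return -np.log((1 / x) - 1)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# sigmoid(asig(p)) recovers p, so a pole in (0, 1) can round-trip through
# an unconstrained parameter (asig mutates its input, hence the copy)
poles = np.array([0.1, 0.5, 0.95])
recovered = sigmoid(asig(poles.copy()))
```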
# Copyright (c) 2021 Idiap Research Institute, http://www.idiap.ch/
# Written by François Marelli <francois.marelli@idiap.ch>
#
# This file is part of Neural Filters.
#
# Neural Filters is free software: you can redistribute it and/or modify
# it under the terms of the 3-Clause BSD License.
#
# Neural Filters is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# 3-Clause BSD License for more details.
#
# You should have received a copy of the 3-Clause BSD License along
# with Neural Filters. If not, see https://opensource.org/licenses/BSD-3-Clause.
#
# SPDX-License-Identifier: BSD-3-Clause
import torch
import torchaudio
import torch.nn.functional as F
class LfilterGrad(torch.autograd.Function):
@staticmethod
def forward(ctx, wf, a_coef, b_coef):
"""
Perform an IIR filter by evaluating difference equation, with autograd support.
Args:
    wf (Tensor): audio waveform of dimension ``(..., time)``.
    a_coef (Tensor): denominator coefficients of the difference equation, of dimension ``(n_order + 1)``.
        Lower-delay coefficients come first, e.g. ``[a0, a1, a2, ...]``.
        Must be the same size as ``b_coef`` (pad with zeros as necessary).
    b_coef (Tensor): numerator coefficients of the difference equation, of dimension ``(n_order + 1)``.
        Lower-delay coefficients come first, e.g. ``[b0, b1, b2, ...]``.
        Must be the same size as ``a_coef`` (pad with zeros as necessary).
Returns:
Tensor: Waveform with dimension of ``(..., time)``.
"""
with torch.no_grad():
hidden_b = torch.zeros_like(a_coef)
hidden_b[0] = 1
# Compute the poles-only filter
hidden = torchaudio.functional.lfilter(wf, a_coef, hidden_b, False)
# Compute the zeros filter
y = hidden.view(-1, 1, hidden.shape[-1])
# Pad signal with 0 at beginning only
y = F.pad(y, [b_coef.numel() - 1, 0])
y = F.conv1d(y, b_coef.flip(0).view(1, 1, -1))
# Reshape the signal back to its original shape
y = y.view(*hidden.shape)
ctx.save_for_backward(wf, a_coef, b_coef, hidden)
return y
@staticmethod
def backward(ctx, grad_output):
wf, a_coef, b_coef, hidden = ctx.saved_tensors
grad_wf = None
grad_a = None
grad_b = None
# Count the number of signals (batch_size * features)
batch = wf.numel() // wf.shape[-1]
with torch.no_grad():
if ctx.needs_input_grad[2]:
grad_b = F.conv1d(F.pad(hidden.view(1, -1, hidden.shape[-1]), [b_coef.numel() - 1, 0]),
grad_output.view(-1, 1,
grad_output.shape[-1]),
groups=batch).sum((0, 1)).flip(0)
if ctx.needs_input_grad[0] or ctx.needs_input_grad[1]:
grad_hidden = F.conv1d(F.pad(grad_output.view(-1, 1, grad_output.shape[-1]), [0, b_coef.numel() - 1]),
b_coef.view(1, 1, -1)).view(*grad_output.shape)
hidden_b = torch.zeros_like(a_coef)
if ctx.needs_input_grad[0]:
hidden_b[0] = 1
grad_wf = torchaudio.functional.lfilter(
grad_hidden.flip(-1), a_coef, hidden_b, False).flip(-1)
if ctx.needs_input_grad[1]:
hidden_b[0] = -1
dh_da = torchaudio.functional.lfilter(
hidden, a_coef, hidden_b, False)
grad_a = F.conv1d(F.pad(dh_da.view(1, -1, dh_da.shape[-1]), [b_coef.numel() - 1, 0]),
grad_hidden.view(-1, 1,
grad_hidden.shape[-1]),
groups=batch).sum((0, 1)).flip(0)
return grad_wf, grad_a, grad_b
lfilter = LfilterGrad.apply
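The forward pass above factors the IIR filter into an all-pole recursion followed by an FIR convolution with the numerator coefficients. Because the filter is linear, the two stages commute with the direct difference equation. A plain NumPy sketch (helper names and coefficients are hypothetical) showing that the factored evaluation matches the direct one:

```python
import numpy as np

def iir_direct(x, a, b):
    """Direct difference equation: y[n] = sum_k b[k] x[n-k] - sum_{k>=1} a[k] y[n-k], with a[0] = 1."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y[n] = acc
    return y

def iir_factored(x, a, b):
    """All-pole stage first (numerator [1]), then FIR with b, mirroring LfilterGrad.forward."""
    hidden = iir_direct(x, a, [1.0])
    y = np.zeros(len(x))
    for n in range(len(x)):
        y[n] = sum(b[k] * hidden[n - k] for k in range(len(b)) if n - k >= 0)
    return y

a = [1.0, -1.2, 0.5]   # denominator coefficients, a[0] = 1
b = [0.3, 0.2, 0.1]    # numerator coefficients
x = np.random.default_rng(1).standard_normal(64)
```

Splitting the computation this way lets the backward pass reuse the all-pole intermediate signal, which is why `forward` saves `hidden` for `backward`.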
"""
NeuralFilterCell
****************
This module implements a basic trainable all-pole first order filter using PyTorch
Copyright (c) 2019 Idiap Research Institute, http://www.idiap.ch/
Written by Francois Marelli <Francois.Marelli@idiap.ch>
This file is part of neural_filters.
"""
import numpy as np
import torch
from torch.nn import Parameter
from torch.nn._functions.rnn import Recurrent, VariableRecurrent
from torch.nn.utils.rnn import PackedSequence