In the last part of this mini-series on forecasting with false nearest neighbors...
Nowadays, Microsoft, Google, Facebook, and OpenAI are sharing lots of state-of-t...
How can the seemingly iterative process of weighted sampling without replacement...
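One well-known way to parallelize weighted sampling without replacement is the Efraimidis-Spirakis key trick: give each item an independent random key u^(1/w) and keep the top-k keys. The sketch below is an illustrative assumption, not necessarily the method the post describes; the function name is hypothetical.

```python
import random

# Efraimidis-Spirakis sketch (assumed technique, not taken from the post):
# each item draws u ~ Uniform(0, 1) and gets key u ** (1 / weight).
# The k largest keys form a weighted sample without replacement, and the
# keys can be computed independently, hence in one parallel pass.
def weighted_sample_without_replacement(items, weights, k, rng=random):
    keyed = [(rng.random() ** (1.0 / w), item)
             for item, w in zip(items, weights)]
    keyed.sort(reverse=True)          # largest keys first
    return [item for _, item in keyed[:k]]
```

Because each key depends only on its own item and weight, the top-k selection is the only step that needs coordination across partitions.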
In a recent post, we showed how an LSTM autoencoder, regularized by false neares...
Sparklyr 1.3 is now available, featuring exciting new functionalities such as in...
In nonlinear dynamics, when the state space is thought to be multidimensional bu...
PixelCNN is a deep learning architecture, or rather a bundle of architectures, designed...
Compared to other applications, deep learning models might not seem too likely a...
Deep learning need not be irreconcilable with privacy protection. Federated lear...
A new sparklyr release is now available. This sparklyr 1.2 release features new ...
A new release of pins is available on CRAN today. This release adds support to t...
The term "federated learning" was coined to describe a form of distributed model...
Kullback-Leibler divergence is not just used to train variational autoencoders o...
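For discrete distributions, the Kullback-Leibler divergence is D_KL(p || q) = sum_i p_i log(p_i / q_i). A minimal sketch, assuming p and q are probability vectors with q positive wherever p is (the function name is hypothetical):

```python
import numpy as np

# Discrete KL divergence D_KL(p || q) = sum p * log(p / q).
# Assumes p and q are valid probability vectors and q > 0 wherever p > 0;
# terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
def kl_divergence(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.9, 0.1]
# KL is asymmetric: D(p||q) generally differs from D(q||p).
print(kl_divergence(p, q), kl_divergence(q, p))
```

The asymmetry shown in the last line is why the direction of the divergence matters when it appears in a training objective.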
Broadcasting, as done by Python's scientific computing library NumPy, involves d...
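The shape-matching NumPy performs can be seen in a small example: a column vector and a row vector are stretched against each other, without copying data, to produce a full grid.

```python
import numpy as np

# Broadcasting: NumPy virtually expands size-1 dimensions so the
# operand shapes align. Here (3, 1) meets (4,) and yields (3, 4).
col = np.arange(3).reshape(3, 1)   # shape (3, 1): [[0], [1], [2]]
row = np.arange(4)                 # shape (4,):   [0, 1, 2, 3]
grid = col * 10 + row              # shape (3, 4)
print(grid.shape)  # (3, 4)
```

No element of `col` or `row` is duplicated in memory; the expansion happens only in how indices are computed.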
TensorFlow 2.1, released last week, allows for mixed-precision training, making ...