Overcoming catastrophic forgetting in neural networks

Another notable step forward in deep neural networks from DeepMind. They took inspiration from neuroscience-based theories about the consolidation of previously acquired skills and memories in mammalian brains: connections between neurons are less likely to be overwritten if they were important for previously learnt tasks. This mechanism is known as "synaptic consolidation". The…
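The paper behind this post (elastic weight consolidation) implements the idea as a quadratic penalty that anchors each parameter to its value after the old task, weighted by that parameter's importance (its diagonal Fisher information). A minimal pure-Python sketch of the penalty term, with hypothetical names (`ewc_penalty`, `fisher`) and toy numbers:

```python
def ewc_penalty(theta, theta_old, fisher, lam=1.0):
    """EWC regularizer: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2.

    Parameters with high Fisher information `fisher[i]` (important for
    the previous task) are anchored strongly to their old values
    `theta_old[i]`; unimportant ones stay free to learn the new task.
    """
    return 0.5 * lam * sum(f * (t - t0) ** 2
                           for t, t0, f in zip(theta, theta_old, fisher))

theta_old = [1.0, -2.0, 0.5]   # parameters learnt on the old task
fisher    = [10.0, 0.1, 1.0]   # per-parameter importance (toy values)
theta     = [1.1, -1.0, 0.5]   # candidate parameters for the new task

# Moving the important first parameter by 0.1 costs as much as moving
# the unimportant second one by a full 1.0: both contribute 0.1 here.
print(ewc_penalty(theta, theta_old, fisher))  # ≈ 0.1
```

In training, this penalty is simply added to the new task's loss, so gradient descent trades off new-task performance against drifting away from parameters the old task depended on.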

New Security Threats from the IoT

Bruce Schneier on new security threats from the Internet of Things: an article well worth reading. The main point, which seems to me the most original part of the talk, is the following equality: IoT (Internet of Things) == a world-sized distributed robot spanning the Internet. «Through the sensors, we're giving the Internet eyes and ears. Through…

Nagios Plugins for Linux v20

I'm pleased to announce the immediate, free availability of the Nagios Plugins for Linux version 20. Full details about what's included can be found in the release notes. As usual, you can download the sources from GitHub. Bug reports, feature requests, and ideas for improvements are welcome! Security fixes: some insecure data-handling issues discovered by…

Nagios Plugins for Linux v19

Release 19 of the Nagios Plugins for Linux is now available for download! You can download the tarball from GitHub. As usual, bug reports, feature requests, and ideas for improvements are welcome! Fixes (check_multipath): recent versions of multipath no longer open a multipathd socket file in the file system, but instead use an abstract namespace socket. Thanks…

Nagios Plugins for Linux 18 released

Here it is: version 18 of the Nagios Plugins for Linux. It's mainly a bugfix release, with a fix for an issue recently pointed out by Paul Dunkler: some of the plugins did not terminate with the correct return code when reaching a warning or critical threshold. The check_memory plugin no longer reports as cached memory the unreclaimable…
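The return codes in question are the standard Nagios plugin exit codes: 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN. A minimal sketch of the threshold-to-status mapping the fixed plugins must honour, with a hypothetical helper name (`threshold_status`), not the plugins' actual C code:

```python
# Standard Nagios plugin exit codes (per the plugin development guidelines)
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def threshold_status(value, warning, critical):
    """Map a measured value to the exit code a plugin must terminate with.

    CRITICAL takes precedence over WARNING; anything below both
    thresholds is OK. The bug fixed in this release was plugins exiting
    with the wrong code after crossing a threshold.
    """
    if value >= critical:
        return CRITICAL
    if value >= warning:
        return WARNING
    return OK

# e.g. 85% usage against 80%/90% thresholds must exit 1 (WARNING)
```

A real plugin would print a status line (e.g. "MEMORY WARNING: ...") and then call `sys.exit()` with this code, since Nagios decides the service state from the exit status alone.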

Gradient Boosting

Gradient boosting: an ensemble technique for regression. Gradient boosting is a machine learning technique for regression and classification problems which produces a prediction model in the form of an ensemble of weak prediction models, typically decision trees. It builds the model in a stage-wise fashion, as other boosting methods do, and generalizes them by allowing optimization of…
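The stage-wise idea can be sketched in a few lines for squared loss: start from the mean prediction, then repeatedly fit a weak learner (here a one-split decision stump) to the current residuals, which are exactly the negative gradient of the loss, and add it with a shrinkage factor. A minimal pure-Python sketch with hypothetical names (`fit_stump`, `gradient_boost`), not any library's actual implementation:

```python
def fit_stump(x, r):
    """Fit a one-split regression stump to residuals r over 1-D feature x."""
    best = None
    for s in sorted(set(x)):
        left  = [ri for xi, ri in zip(x, r) if xi <= s]
        right = [ri for xi, ri in zip(x, r) if xi > s]
        if not right:          # a split must leave both sides non-empty
            continue
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        err = sum((ri - (lv if xi <= s else rv)) ** 2
                  for xi, ri in zip(x, r))
        if best is None or err < best[0]:
            best = (err, s, lv, rv)
    _, s, lv, rv = best
    return lambda z: lv if z <= s else rv

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Stage-wise boosting for squared loss: each round fits a stump to
    the residuals (the negative gradient) and adds it with shrinkage lr."""
    base = sum(y) / len(y)            # stage 0: constant prediction
    pred = [base] * len(y)
    for _ in range(n_rounds):
        residual = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residual)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return pred
```

Replacing the squared-loss residual with the negative gradient of any differentiable loss is exactly the generalization the excerpt alludes to; the shrinkage factor `lr` is the learning rate that trades rounds for regularization.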