Artificial Neural Networks "learn" to perform tasks by considering examples. They do this without any prior knowledge; instead, they automatically derive identifying characteristics from the examples they process.
How good is good enough? How wrong is still an acceptable level? The confusion (or error) matrix is an important tool: it gives the risk profile we have to deal with when using AI models. It is exactly the same tool that is often used for the evaluation of medical interventions.
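As a minimal sketch of how such a matrix is built, here is a hand-rolled 2×2 confusion matrix in NumPy; the labels and predictions are made-up toy data, not results from any real model:

```python
import numpy as np

# Hypothetical ground truth and model predictions for a binary test
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

# Count the four cells of the 2x2 confusion matrix
tp = np.sum((y_true == 1) & (y_pred == 1))  # true positives
tn = np.sum((y_true == 0) & (y_pred == 0))  # true negatives
fp = np.sum((y_true == 0) & (y_pred == 1))  # false positives
fn = np.sum((y_true == 1) & (y_pred == 0))  # false negatives
confusion = np.array([[tn, fp],
                      [fn, tp]])            # rows: actual, columns: predicted

sensitivity = tp / (tp + fn)  # true positive rate
specificity = tn / (tn + fp)  # true negative rate
```

Sensitivity and specificity are exactly the quantities used to evaluate medical tests, which is why the same risk-profile reasoning carries over.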
Estimating the appropriate parameters of a model is at the core of every machine learning algorithm. In neural networks this procedure has to be repeated over and over. Because of the non-linearities, numerical approaches that approximate the solution iteratively are an important class of methods.
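The simplest member of that iterative class is plain gradient descent. As a sketch, here it is on a toy one-dimensional function (the function, learning rate, and step count are chosen only for illustration):

```python
# Minimal gradient-descent sketch: minimize f(x) = (x - 3)^2 iteratively.
def grad(x):
    return 2.0 * (x - 3.0)  # derivative of (x - 3)^2

x = 0.0    # starting guess
lr = 0.1   # learning rate
for _ in range(100):
    x -= lr * grad(x)  # the update that is repeated over and over
```

After enough iterations `x` approaches the minimum at 3; in a neural network the same update runs over millions of parameters instead of one.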
With the increasing possibilities to gather longitudinal data, there is growing interest in mining profiles in the form of time series. The key question is how to identify and group similarities and dissimilarities between the profiles.
We start with a complex mathematical function and end up with some decorative graphics.
Landscape generation using midpoint displacement in vectorized form
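A one-dimensional sketch of the idea, with the midpoints of each refinement level computed in a single vectorized step (the roughness parameter and iteration count here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def midpoint_displacement(iterations, roughness=0.5):
    """1-D midpoint displacement: repeatedly insert randomly displaced
    midpoints between neighbouring points. Vectorized: all midpoints of
    a level are produced by one array operation."""
    heights = np.array([0.0, 0.0])   # flat start profile
    amplitude = 1.0
    for _ in range(iterations):
        mid = 0.5 * (heights[:-1] + heights[1:])             # all midpoints at once
        mid += rng.uniform(-amplitude, amplitude, mid.size)  # random displacement
        # interleave the original points and the new midpoints
        new = np.empty(heights.size + mid.size)
        new[0::2] = heights
        new[1::2] = mid
        heights = new
        amplitude *= roughness   # shrink the displacement range each level
    return heights

profile = midpoint_displacement(8)   # 257 points of synthetic terrain
```

Plotting `profile` gives a jagged skyline; in two dimensions the same scheme (diamond-square) generates full landscapes.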
A probabilistic art engine for Mondrian-style images
We try out a beautiful art engine and use vectorized forms to accelerate the execution.
A short script to plot the Fibonacci sequence as a curve.
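One way such a short script could look; the sequence is computed first, and the plot is drawn only if matplotlib is available:

```python
# First twelve Fibonacci numbers
fib = [0, 1]
for _ in range(10):
    fib.append(fib[-1] + fib[-2])

try:
    import matplotlib
    matplotlib.use("Agg")            # non-interactive backend
    import matplotlib.pyplot as plt
    plt.plot(fib, marker="o")        # the sequence as a rising curve
    plt.xlabel("index")
    plt.ylabel("Fibonacci number")
    plt.savefig("fibonacci.png")
except ImportError:
    pass  # plotting is optional here; the sequence is computed above
```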
Two algorithms to determine the signal in noisy data
We generate some basic signals and use convolution and windowing to reconstruct ECG waves.
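The core trick can be sketched in a few lines: convolve the noisy signal with a normalized window to average the noise away. A sine wave stands in for the ECG here, and the noise level and window length are made-up values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy test signal: a sine wave plus Gaussian noise
t = np.linspace(0, 2 * np.pi, 500)
clean = np.sin(t)
noisy = clean + rng.normal(0, 0.3, t.size)

# Moving-average filter: convolution with a normalized rectangular window
window = np.ones(25) / 25.0
smoothed = np.convolve(noisy, window, mode="same")

# Mean squared error against the clean signal, before and after smoothing
err_noisy = np.mean((noisy - clean) ** 2)
err_smooth = np.mean((smoothed - clean) ** 2)
```

The smoothed curve sits much closer to the clean signal than the raw input does; shaped windows (Hann, Gaussian, ...) trade off smoothing against distortion of sharp ECG peaks.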
We explore the computation of mean values and interpolations when data have more than one dimension. We numerically generate a macro finite element so that data of arbitrary size can be analysed while taking their inherent structure into account.
We use the Metropolis-Hastings algorithm to sample a 2-dimensional empirical distribution.
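A minimal sketch of the algorithm, using an unnormalized 2-D Gaussian as a stand-in for the empirical density (the proposal width and chain length are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(2)

def density(p):
    """Unnormalized target density (placeholder for an empirical one)."""
    return np.exp(-0.5 * np.sum(p ** 2))

samples = []
current = np.zeros(2)                      # starting point of the chain
for _ in range(5000):
    proposal = current + rng.normal(0, 1.0, 2)   # symmetric random step
    # accept with probability min(1, density ratio); otherwise stay put
    if rng.uniform() < density(proposal) / density(current):
        current = proposal
    samples.append(current)
samples = np.array(samples)
```

Because only density *ratios* enter the acceptance step, the target never has to be normalized, which is exactly what makes the method work on empirical distributions.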
We let a robot climb the Matterhorn with a Markov Chain Monte Carlo walk.
Using the coin-flipping example, we give some arguments why the use of distribution vectors can be helpful as a preparation for Markov Chain Monte Carlo models and others, and how this changes the role of medical researchers.
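For the coin-flipping example, a distribution vector is simply the array of binomial probabilities for every possible number of heads; a sketch for ten fair flips:

```python
import numpy as np
from math import comb

# Distribution vector for the number of heads in 10 fair coin flips:
# entry k holds P(exactly k heads) = C(10, k) * 0.5^10
n = 10
dist = np.array([comb(n, k) * 0.5 ** n for k in range(n + 1)])
```

Instead of tracking single outcomes, one propagates this whole vector through a model, which is precisely the viewpoint Markov chain models build on.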
We give some arguments why a change from a decision tree to a Markov model can be worthwhile. We provide seven lines of code to run a Markov model.
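As a flavour of how compact such a model can be, here is a sketch of a three-state health model; the states and transition probabilities are invented for illustration, not taken from any study:

```python
import numpy as np

# Hypothetical transition matrix: healthy, sick, dead (each row sums to 1)
P = np.array([[0.90, 0.08, 0.02],
              [0.20, 0.65, 0.15],
              [0.00, 0.00, 1.00]])   # "dead" is absorbing
state = np.array([1.0, 0.0, 0.0])    # everyone starts healthy
for _ in range(10):                  # ten model cycles
    state = state @ P                # propagate the whole distribution one step
```

One matrix multiplication per cycle replaces an exponentially branching decision tree.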
We provide a simple sampling engine that generates random numbers distributed according to an arbitrary empirical distribution given as a data array.
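One standard way to build such an engine is inverse-CDF sampling; a sketch with a made-up three-point distribution (the function name and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def empirical_sampler(values, weights, n):
    """Draw n samples distributed like the empirical distribution given
    by `values` (support) and `weights` (relative frequencies)."""
    cdf = np.cumsum(weights, dtype=float)
    cdf /= cdf[-1]                        # normalize to a proper CDF
    u = rng.uniform(size=n)               # uniform draws on [0, 1)
    return values[np.searchsorted(cdf, u)]  # invert the CDF per draw

values = np.array([1, 2, 3])
weights = np.array([0.7, 0.2, 0.1])
draws = empirical_sampler(values, weights, 10000)
```

With 10000 draws, roughly 70% come out as 1, matching the first weight.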
How to create Python loops over dates and times
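The basic pattern: step a `date` forward with a `timedelta` until the end is reached (the dates here are arbitrary examples):

```python
from datetime import date, timedelta

# Loop over calendar days by repeatedly adding one day
start = date(2020, 1, 1)
end = date(2020, 1, 10)
days = []
current = start
while current < end:
    days.append(current.isoformat())  # e.g. '2020-01-03'
    current += timedelta(days=1)
```

The same idea works with `datetime` objects and sub-day steps such as `timedelta(hours=1)`.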
We check which numerical scheme is most useful for the temporal integration of one generation of a population.
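The simplest candidate scheme is explicit Euler; a sketch for exponential growth `dN/dt = r·N`, with made-up rate, step size, and horizon, compared against the analytic solution:

```python
import numpy as np

# Explicit Euler sketch for dN/dt = r * N (hypothetical parameter values)
r, dt, steps = 0.1, 0.01, 1000
N = 100.0                               # initial population
for _ in range(steps):
    N += dt * r * N                     # Euler update N <- N + dt * f(N)

exact = 100.0 * np.exp(r * dt * steps)  # analytic reference solution
```

With this small `dt` the Euler result stays within a fraction of a percent of the exact value; comparing schemes means repeating this with larger steps and higher-order updates.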
We simulate a multi-center study of a drug that lowers diastolic blood pressure.
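The skeleton of such a simulation: a between-center random effect plus within-center patient noise. Every number below (centers, patients, effect size, variances) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical trial: the drug lowers diastolic blood pressure by ~8 mmHg
centers, patients = 5, 40
center_offset = rng.normal(0, 3, centers)            # between-center variation
baseline = (95 + center_offset[:, None]              # center-specific level
            + rng.normal(0, 8, (centers, patients))) # patient variation
treated = baseline - 8 + rng.normal(0, 5, (centers, patients))

mean_drop = np.mean(baseline - treated)              # estimated treatment effect
```

Averaging over all centers recovers the simulated 8 mmHg effect up to sampling noise, and varying the parameters shows how center heterogeneity blurs it.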
We introduce a basic scheme in computational fluid dynamics for solving the two-dimensional heat equation with a source term and constant diffusivities on an equidistant rectangular grid.
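A compact sketch of the explicit (FTCS) update on such a grid, with a point source and zero boundaries; grid size, diffusivity, and time step are illustrative values chosen to satisfy the stability bound `dt ≤ dx²/(4D)`:

```python
import numpy as np

# Explicit scheme for u_t = D * (u_xx + u_yy) + q on an equidistant grid
n, dx, D, dt = 32, 1.0, 1.0, 0.2
u = np.zeros((n, n))
q = np.zeros((n, n))
q[n // 2, n // 2] = 1.0          # constant point source in the center

for _ in range(200):
    # five-point Laplacian, vectorized via shifted copies of the grid
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx**2
    u += dt * (D * lap + q)      # forward-Euler time step
    u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0   # Dirichlet boundaries
```

Heat spreads outward from the source and drains through the cold boundaries until a steady profile emerges.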
By performing linear regression with a Monte Carlo method we obtain estimates (mean, standard deviation, standard error) of the slope and the intercept.
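One Monte Carlo route to those estimates is to refit the line on bootstrap resamples of the data; a sketch on synthetic data generated around a known line (slope 2, intercept 1):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data scattered around y = 2x + 1
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)

# Monte Carlo: refit the regression on resampled data, many times
slopes, intercepts = [], []
for _ in range(2000):
    idx = rng.integers(0, x.size, x.size)   # resample indices with replacement
    m, b = np.polyfit(x[idx], y[idx], 1)    # straight-line fit to the resample
    slopes.append(m)
    intercepts.append(b)

slope_mean, slope_sd = np.mean(slopes), np.std(slopes)
```

The spread of `slopes` directly yields the standard deviation of the slope estimate, with no distributional formula required.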
We use a Monte Carlo method with six lines of code for the integration of mathematical functions. In the case of a circle we can determine π.
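The circle case can be sketched like this: throw random points into the square [-1, 1]², count the fraction landing inside the unit circle, and scale by the square's area:

```python
import numpy as np

rng = np.random.default_rng(6)

# Monte Carlo integration: area of the unit circle via random points
n = 1_000_000
xy = rng.uniform(-1, 1, (n, 2))                       # points in the square
inside = np.sum(xy[:, 0] ** 2 + xy[:, 1] ** 2 <= 1.0) # hits inside the circle
pi_estimate = 4.0 * inside / n                        # area ratio times 4
```

The error shrinks like 1/√n, so a million points already pin π down to a few decimals.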
As a first example for numerical statistics we introduce bootstrapping which belongs to the class of Monte Carlo methods.
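The essence of bootstrapping in a few lines: resample the data with replacement, recompute the statistic each time, and read the uncertainty off the resulting distribution. The data here are a made-up normal sample:

```python
import numpy as np

rng = np.random.default_rng(5)

data = rng.normal(5.0, 2.0, 100)   # hypothetical measured sample

# Bootstrap: resample with replacement, recompute the mean each time
boot_means = np.array([rng.choice(data, data.size).mean()
                       for _ in range(5000)])
ci = np.percentile(boot_means, [2.5, 97.5])   # 95% confidence interval
```

The same three lines work for medians, correlations, or any other statistic, which is what makes the method a natural first example.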
A short guide on how to create a Pelican blog website with Jupyter.