Using tools from topology and functional analysis, we provide a framework
in which artificial neural networks and their architectures can be formally
described. We define the notion of a machine in a general topological context and
show how simple machines can be combined into more complex ones. We explore
finite- and infinite-depth machines, which generalize neural networks and
neural ordinary differential equations. Borrowing ideas from functional
analysis and kernel methods, we build complete, normed, infinite-dimensional
spaces of machines, and we discuss how to find optimal architectures and
parameters within those spaces to solve a given computational problem. In
our numerical experiments, these kernel-inspired networks can outperform
classical neural networks when the training dataset is small.