In an attempt to capture the fundamental features common to neural networks, we define a parameterized Neural Abstract Machine (NAM) in such a way that the major neural network models in the literature can be described as natural extensions or refinements of the NAM. We illustrate this refinement for feedforward networks with back-propagation training. The NAM provides a platform- and programming-language-independent basis for the comparative mathematical and experimental analysis and evaluation of different implementations of neural networks. Here we concentrate on the computational core, the Neural Kernel (NK), and provide abstract interfaces for the other NAM components.
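To make the idea of refinement concrete, the following is a minimal sketch, assuming a Python rendering of the NAM's component structure: an abstract NK interface that any network model must refine, and one such refinement for a feedforward network trained with back-propagation. All names here (NeuralKernel, BackpropFeedForwardNK, propagate, adapt) are illustrative assumptions, not the paper's actual interfaces.

```python
# Sketch only: hypothetical names, not the NAM's actual API.
import math
import random
from abc import ABC, abstractmethod
from typing import List, Sequence


class NeuralKernel(ABC):
    """Abstract computational core (NK): concrete network models refine
    this interface with their own propagation and training rules."""

    @abstractmethod
    def propagate(self, inputs: Sequence[float]) -> List[float]:
        """Compute the network output for one input pattern."""

    @abstractmethod
    def adapt(self, inputs: Sequence[float], targets: Sequence[float]) -> float:
        """Apply one training step and return the pattern error."""


class BackpropFeedForwardNK(NeuralKernel):
    """Refinement of the NK: a single-hidden-layer feedforward network
    trained by back-propagation (gradient descent on squared error)."""

    def __init__(self, n_in: int, n_hidden: int, n_out: int, lr: float = 0.5):
        rnd = random.Random(0)
        self.lr = lr
        # One bias weight per unit is appended to each weight row.
        self.w1 = [[rnd.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
                   for _ in range(n_hidden)]
        self.w2 = [[rnd.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]
                   for _ in range(n_out)]

    @staticmethod
    def _sig(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    def propagate(self, inputs):
        self._x = list(inputs) + [1.0]                    # inputs + bias
        self._h = [self._sig(sum(w * v for w, v in zip(row, self._x)))
                   for row in self.w1]
        hb = self._h + [1.0]                              # hidden + bias
        return [self._sig(sum(w * v for w, v in zip(row, hb)))
                for row in self.w2]

    def adapt(self, inputs, targets):
        y = self.propagate(inputs)
        hb = self._h + [1.0]
        # Output deltas for sigmoid units under squared error.
        d_out = [(t - o) * o * (1 - o) for t, o in zip(targets, y)]
        # Hidden deltas: output errors propagated back through w2.
        d_hid = [h * (1 - h) * sum(d * self.w2[k][j]
                                   for k, d in enumerate(d_out))
                 for j, h in enumerate(self._h)]
        # Gradient-descent weight updates.
        for k, row in enumerate(self.w2):
            for j, v in enumerate(hb):
                row[j] += self.lr * d_out[k] * v
        for j, row in enumerate(self.w1):
            for i, v in enumerate(self._x):
                row[i] += self.lr * d_hid[j] * v
        return 0.5 * sum((t - o) ** 2 for t, o in zip(targets, y))
```

Other models (e.g., recurrent or competitive networks) would refine the same two-method interface differently, which is what makes a comparative, implementation-independent evaluation possible. A brief usage example, training the refinement on XOR:

```python
nk = BackpropFeedForwardNK(n_in=2, n_hidden=4, n_out=1)
for _ in range(10000):
    for x, t in ([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0]):
        nk.adapt(x, t)
print([round(nk.propagate(x)[0], 2) for x in ([0, 0], [0, 1], [1, 0], [1, 1])])
```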