
Keras functional API model architecture
The functional API is the better choice when your model diverges from the basic pattern of one input, a succession of layers, and one output. Examples include models with multiple inputs, multiple outputs, or a more complex internal structure, such as feeding the output of a given layer into several other layers or, conversely, merging the outputs of different layers into a single input for another layer.
The functional API therefore allows you to create models with greater flexibility: we can connect layers in arbitrary ways, not just from the previous layer to the next. We can link a layer to any other layer, thus creating complex networks.
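As a brief illustration of this flexibility, the following sketch merges two separate inputs into a single model (the layer sizes and input shapes here are hypothetical, chosen only for the example):

```python
from keras.layers import Input, Dense, Concatenate
from keras.models import Model

# Two independent inputs (shapes are illustrative)
numeric_input = Input(shape=(32,))
embedding_input = Input(shape=(64,))

# Each branch is processed by its own dense layer...
x1 = Dense(16, activation='relu')(numeric_input)
x2 = Dense(16, activation='relu')(embedding_input)

# ...then the branch outputs are merged and fed to one output layer
merged = Concatenate()([x1, x2])
output = Dense(1, activation='sigmoid')(merged)

model = Model(inputs=[numeric_input, embedding_input], outputs=output)
```

A sequential model cannot express this topology, because it only supports a single linear stack of layers.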
To understand the difference between the two models that Keras offers, we will use a simple example: a densely connected network of the type already seen in The Keras sequential model architecture section. In a densely connected network, every input is connected to every output by a weight, which is generally followed by a non-linear activation function. We choose this architecture again for its simplicity. We will build the model using the following steps:
We will begin by importing the required libraries using the following code block:
from keras.layers import Input, Dense
from keras.models import Model
Three classes have been imported: the Input and Dense layer classes and the Model class.
Then, we have to instantiate a Keras tensor. In this case, we must define a standalone input layer that specifies the shape of the input data (the tensor). The Input layer accepts a shape argument, which is a tuple indicating the dimensionality of the input data:
InputTensor = Input(shape=(100,))
This returns a tensor.
Now, we can define the layers using the following code block:
H1 = Dense(10, activation='relu')(InputTensor)
The first dense layer is created; it takes the input layer's output (InputTensor) as its input and returns the tensor H1. It is this way of connecting layers (layer by layer) that gives the functional API its flexibility: a layer instance is callable on a tensor, and returns a tensor.
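Because a layer instance is an ordinary callable object, the same instance can even be applied to more than one tensor, in which case the two calls share a single set of weights. A minimal sketch (not part of the model built in this section):

```python
from keras.layers import Input, Dense

# One Dense instance, applied to two different input tensors:
# both outputs are computed with the same shared weights
shared = Dense(10, activation='relu')
a = Input(shape=(100,))
b = Input(shape=(100,))
out_a = shared(a)  # same weights...
out_b = shared(b)  # ...reused on a second tensor
```

This shared-layer pattern is only possible with the functional API.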
Let's move on to the next layer:
H2 = Dense(20, activation='relu')(H1)
So, a second dense layer is created that takes the first Dense layer's output (H1) as its input and returns H2.
Let's move on to the final layer creation:
Output = Dense(1, activation='sigmoid')(H2)
Finally, a third dense layer is created that takes the second Dense layer's output (H2) as its input and returns the output tensor (Output). Note that a sigmoid activation is used here: a softmax over a single unit would always return 1, so it is not suitable for a one-neuron output layer.
Now, we can create a model that includes the Input layer and three Dense layers:
model = Model(inputs=InputTensor, outputs=Output)
The model created has 100 inputs, two hidden layers with 10 and 20 neurons respectively, and an output layer with a single output.
To print a summary of the model, simply type the following command:
model.summary()
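Putting the steps above together gives a complete, runnable script (using a sigmoid activation for the single output unit, since a softmax over one unit would be constant):

```python
from keras.layers import Input, Dense
from keras.models import Model

# Standalone input layer defining the shape of the input tensor
InputTensor = Input(shape=(100,))

# Two hidden dense layers, each called on the previous tensor
H1 = Dense(10, activation='relu')(InputTensor)
H2 = Dense(20, activation='relu')(H1)

# Single-neuron output layer
Output = Dense(1, activation='sigmoid')(H2)

model = Model(inputs=InputTensor, outputs=Output)
model.summary()

# Parameter counts per layer:
#   (100*10 + 10) + (10*20 + 20) + (20*1 + 1) = 1010 + 220 + 21 = 1251
```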
In the following screenshot, we can see the results:

All these terms will become clearer in the following chapters.