ML Comparison

08 Jan 2021

This is a constantly evolving field, so I'll keep updating this comparison and adding other frameworks as time permits. The tables below map each Keras API element to the closest raw TensorFlow (Python) counterpart, excluding tf.keras; a blank cell means no direct single-op equivalent is listed.

Models

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Model | Model | |
| Sequential model | Sequential | |
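
To make the first two rows concrete, here is a minimal Keras-only sketch of the two model-building entry points (layer sizes are arbitrary placeholders); raw TensorFlow has no single counterpart listed for either.

```python
from tensorflow import keras

# Sequential: an ordered stack of layers.
sequential_model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(16,)),
    keras.layers.Dense(1),
])

# Model: the functional API, built from explicit input and output tensors.
inputs = keras.Input(shape=(16,))
hidden = keras.layers.Dense(32, activation="relu")(inputs)
outputs = keras.layers.Dense(1)(hidden)
functional_model = keras.Model(inputs=inputs, outputs=outputs)
```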

Core Layer Types

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Base layer | Base Layer | |
| Activation layer | Activation | |
| Dense layer | Dense | |
| Embedding layer | Embedding | |
| Masking layer | Masking | |
| Lambda layer | Lambda | |
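
No single raw-TF op is listed for the core layers. As a hedged illustration of why, Dense is roughly a matmul plus bias plus activation when spelled out with low-level ops (shapes below are placeholders):

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([8, 16])                 # batch of 8 samples, 16 features

dense = keras.layers.Dense(4, activation="relu")
y_keras = dense(x)                            # (8, 4); this call builds kernel and bias

# Rough low-level equivalent, reusing the weights the layer just created.
y_raw = tf.nn.relu(tf.matmul(x, dense.kernel) + dense.bias)

tf.debugging.assert_near(y_keras, y_raw)      # same weights, same math
```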

Convolution Layer Types

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Conv1D layer | Conv1D | tf.nn.conv1d |
| Conv2D layer | Conv2D | tf.nn.conv2d |
| Conv3D layer | Conv3D | tf.nn.conv3d |
| SeparableConv1D layer | SeparableConv1D | |
| SeparableConv2D layer | SeparableConv2D | tf.nn.separable_conv2d |
| DepthwiseConv2D layer | DepthwiseConv2D | |
| Conv2DTranspose layer | Conv2DTranspose | tf.nn.conv2d_transpose |
| Conv3DTranspose layer | Conv3DTranspose | tf.nn.conv3d_transpose |
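
As a sketch of how one of these rows reads in code (input shape and filter count are placeholders): Conv2D owns its filter and bias variables, while with tf.nn.conv2d you manage them yourself.

```python
import tensorflow as tf
from tensorflow import keras

images = tf.random.normal([1, 28, 28, 3])     # NHWC batch

conv = keras.layers.Conv2D(filters=8, kernel_size=3, padding="same")
y_keras = conv(images)                        # (1, 28, 28, 8)

# Raw op, reusing the layer's filter and bias variables.
y_raw = tf.nn.conv2d(images, conv.kernel, strides=1, padding="SAME")
y_raw = tf.nn.bias_add(y_raw, conv.bias)

tf.debugging.assert_near(y_keras, y_raw, atol=1e-4)
```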

Pooling Layer Types

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| MaxPooling1D layer | MaxPooling1D | tf.nn.max_pool1d |
| MaxPooling2D layer | MaxPooling2D | tf.nn.max_pool2d |
| MaxPooling3D layer | MaxPooling3D | tf.nn.max_pool3d |
| AveragePooling1D layer | AveragePooling1D | tf.nn.avg_pool1d |
| AveragePooling2D layer | AveragePooling2D | tf.nn.avg_pool2d |
| AveragePooling3D layer | AveragePooling3D | |
| GlobalMaxPooling1D layer | GlobalMaxPooling1D | |
| GlobalMaxPooling2D layer | GlobalMaxPooling2D | |
| GlobalMaxPooling3D layer | GlobalMaxPooling3D | |
| GlobalAveragePooling1D layer | GlobalAveragePooling1D | |
| GlobalAveragePooling2D layer | GlobalAveragePooling2D | |
| GlobalAveragePooling3D layer | GlobalAveragePooling3D | |
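
Same idea for the pooling pairs above (placeholder shapes): the Keras layer wraps the raw op, with strides defaulting to the pool size.

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([1, 28, 28, 3])

y_keras = keras.layers.MaxPooling2D(pool_size=2)(x)              # (1, 14, 14, 3)
y_raw = tf.nn.max_pool2d(x, ksize=2, strides=2, padding="VALID")

tf.debugging.assert_near(y_keras, y_raw)
```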

Recurrent layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| LSTM layer | LSTM | |
| GRU layer | GRU | |
| SimpleRNN layer | SimpleRNN | |
| TimeDistributed layer | TimeDistributed | |
| Bidirectional layer | Bidirectional | |
| ConvLSTM2D layer | ConvLSTM2D | |
| Base RNN layer | Base RNN | |
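
No single raw-TF ops are listed for the recurrent layers, so this is a Keras-only sketch (batch, timestep and unit counts are placeholders):

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([4, 10, 16])                                 # (batch, timesteps, features)

last_state = keras.layers.LSTM(32)(x)                             # (4, 32), final hidden state
full_seq = keras.layers.GRU(32, return_sequences=True)(x)         # (4, 10, 32), state per timestep
both_dirs = keras.layers.Bidirectional(keras.layers.LSTM(32))(x)  # (4, 64), forward + backward
```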

Core preprocessing layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| TextVectorization layer | TextVectorization | |
| Normalization layer | Normalization | |
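
A minimal Keras-only sketch of the two core preprocessing layers (toy data). Note that in TF 2.4-era releases these classes live under keras.layers.experimental.preprocessing; the imports below assume the newer keras.layers path.

```python
import tensorflow as tf
from tensorflow import keras

texts = tf.constant(["the cat sat", "the dog ran"])
vectorize = keras.layers.TextVectorization(output_mode="int")
vectorize.adapt(texts)                           # learn the vocabulary from data
token_ids = vectorize(texts)                     # integer token-id sequences

norm = keras.layers.Normalization()
norm.adapt(tf.constant([[1.0], [2.0], [3.0]]))   # learn mean and variance
standardized = norm(tf.constant([[2.0]]))        # ~0 after standardization
```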

Categorical data preprocessing layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| CategoryEncoding layer | CategoryEncoding | |
| Hashing layer | Hashing | |
| Discretization layer | Discretization | |
| StringLookup layer | StringLookup | |
| IntegerLookup layer | IntegerLookup | |
| CategoryCrossing layer | CategoryCrossing | |
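
A hedged sketch chaining two of the categorical layers (same version caveat as above; the vocabulary and inputs are made up):

```python
import tensorflow as tf
from tensorflow import keras

colors = tf.constant(["red", "green", "purple"])          # "purple" is out-of-vocabulary

lookup = keras.layers.StringLookup(vocabulary=["red", "green", "blue"])
ids = lookup(colors)                                      # unknown strings map to the OOV index 0

one_hot = keras.layers.CategoryEncoding(num_tokens=4, output_mode="one_hot")
encoded = one_hot(ids)                                    # (3, 4): 3 vocab entries + 1 OOV slot
```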

Image preprocessing & augmentation layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Resizing layer | Resizing | |
| Rescaling layer | Rescaling | |
| CenterCrop layer | CenterCrop | |
| RandomCrop layer | RandomCrop | |
| RandomFlip layer | RandomFlip | |
| RandomTranslation layer | RandomTranslation | |
| RandomRotation layer | RandomRotation | |
| RandomZoom layer | RandomZoom | |
| RandomHeight layer | RandomHeight | |
| RandomWidth layer | RandomWidth | |
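
These compose naturally into a small augmentation pipeline; a Keras-only sketch with placeholder sizes and factors follows (same version caveat as the other preprocessing layers). The Random* layers perturb inputs during training and pass them through unchanged when training=False.

```python
import tensorflow as tf
from tensorflow import keras

augment = keras.Sequential([
    keras.layers.Resizing(64, 64),
    keras.layers.Rescaling(1.0 / 255),
    keras.layers.RandomFlip("horizontal"),
    keras.layers.RandomRotation(0.1),
    keras.layers.RandomZoom(0.1),
])

images = tf.random.uniform([8, 96, 96, 3], maxval=255.0)
augmented = augment(images, training=True)    # (8, 64, 64, 3), randomly perturbed
```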

Normalization Layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| BatchNormalization layer | BatchNormalization | tf.nn.batch_normalization |
| LayerNormalization layer | LayerNormalization | |
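
For the BatchNormalization row: tf.nn.batch_normalization only applies the normalization formula, while the Keras layer also owns the learnable scale/offset and the moving averages. A sketch with placeholder shapes, reusing the layer's own variables so the two paths line up:

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([32, 10])

bn = keras.layers.BatchNormalization()
y_keras = bn(x, training=True)               # normalizes with the current batch statistics

# Raw op: you supply mean, variance, offset (beta) and scale (gamma) yourself.
mean, variance = tf.nn.moments(x, axes=[0])
y_raw = tf.nn.batch_normalization(
    x, mean, variance, offset=bn.beta, scale=bn.gamma, variance_epsilon=bn.epsilon
)

tf.debugging.assert_near(y_keras, y_raw, atol=1e-3)
```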

Regularization layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Dropout layer | Dropout | tf.nn.dropout |
| SpatialDropout1D layer | SpatialDropout1D | |
| SpatialDropout2D layer | SpatialDropout2D | |
| SpatialDropout3D layer | SpatialDropout3D | |
| GaussianDropout layer | GaussianDropout | |
| GaussianNoise layer | GaussianNoise | |
| ActivityRegularization layer | ActivityRegularization | |
| AlphaDropout layer | AlphaDropout | |
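
For the Dropout row (rate is a placeholder): tf.nn.dropout always drops, while the Keras layer only drops when called with training=True and is the identity at inference time.

```python
import tensorflow as tf
from tensorflow import keras

x = tf.ones([4, 8])

dropout = keras.layers.Dropout(rate=0.5)
y_train = dropout(x, training=True)     # roughly half the units zeroed, rest scaled by 1/(1-rate)
y_infer = dropout(x, training=False)    # identity at inference time

# Raw op: always applies dropout; gating on training is up to you.
y_raw = tf.nn.dropout(x, rate=0.5)
```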

Attention layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| MultiHeadAttention layer | MultiHeadAttention | |
| Attention layer | Attention | |
| AdditiveAttention layer | AdditiveAttention | |
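
A Keras-only sketch of MultiHeadAttention used for self-attention (head count and dimensions are placeholders); no single raw-TF op is listed for it.

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([2, 5, 16])          # (batch, sequence, features)

mha = keras.layers.MultiHeadAttention(num_heads=2, key_dim=8)
y = mha(query=x, value=x, key=x)          # self-attention, output shape (2, 5, 16)
```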

Reshaping layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Reshape layer | Reshape | |
| Flatten layer | Flatten | tf.reshape(w, [-1]) |
| RepeatVector layer | RepeatVector | |
| Permute layer | Permute | tf.transpose |
| Cropping1D layer | Cropping1D | |
| Cropping2D layer | Cropping2D | |
| Cropping3D layer | Cropping3D | |
| UpSampling1D layer | UpSampling1D | |
| UpSampling2D layer | UpSampling2D | |
| UpSampling3D layer | UpSampling3D | |
| ZeroPadding1D layer | ZeroPadding1D | |
| ZeroPadding2D layer | ZeroPadding2D | |
| ZeroPadding3D layer | ZeroPadding3D | |
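
The table pairs Flatten with tf.reshape(w, [-1]) and Permute with tf.transpose; one nuance worth showing (placeholder shapes) is that the Keras layers leave the batch axis alone, whereas with the raw calls you spell out every axis yourself.

```python
import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([4, 3, 5])

# Keras layers keep the batch axis (axis 0) fixed.
flat = keras.layers.Flatten()(x)                # (4, 15)
perm = keras.layers.Permute((2, 1))(x)          # (4, 5, 3); Permute indices start at 1

# Raw-op counterparts with the batch axis written out explicitly.
flat_raw = tf.reshape(x, [tf.shape(x)[0], -1])  # (4, 15); tf.reshape(x, [-1]) would also
                                                # fold the batch axis into one long vector
perm_raw = tf.transpose(x, perm=[0, 2, 1])      # (4, 5, 3)

tf.debugging.assert_near(flat, flat_raw)
tf.debugging.assert_near(perm, perm_raw)
```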

Merging layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| Concatenate layer | Concatenate | |
| Average layer | Average | |
| Maximum layer | Maximum | |
| Minimum layer | Minimum | |
| Add layer | Add | |
| Subtract layer | Subtract | |
| Multiply layer | Multiply | |
| Dot layer | Dot | |
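
A Keras-only sketch of two merging layers inside a functional model (feature sizes are placeholders): elementwise layers such as Add expect matching shapes, while Concatenate joins inputs along an axis.

```python
from tensorflow import keras

a = keras.Input(shape=(8,))
b = keras.Input(shape=(8,))

summed = keras.layers.Add()([a, b])                    # (None, 8)
joined = keras.layers.Concatenate(axis=-1)([a, b])     # (None, 16)

model = keras.Model(inputs=[a, b], outputs=[summed, joined])
```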

Locally-connected layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| LocallyConnected1D layer | LocallyConnected1D | |
| LocallyConnected2D layer | LocallyConnected2D | |

Activation layers

| API Element Type | Keras | TensorFlow (Python), excl. tf.keras |
| --- | --- | --- |
| ReLU layer | ReLU | tf.nn.relu |
| Softmax layer | Softmax | tf.nn.softmax |
| LeakyReLU layer | LeakyReLU | tf.nn.leaky_relu |
| PReLU layer | PReLU | |
| ELU layer | ELU | tf.nn.elu |
| ThresholdedReLU layer | ThresholdedReLU | |
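
Finally, the activation layers map almost one-to-one onto tf.nn functions; the layer form is convenient inside Sequential or functional models, the function form anywhere else. A quick check of the pairs listed above (sample values are placeholders; note tf.nn.leaky_relu defaults to a 0.2 slope while the Keras layer defaults to 0.3, hence the explicit alpha):

```python
import tensorflow as tf
from tensorflow import keras

x = tf.constant([-2.0, -0.5, 0.0, 3.0])

tf.debugging.assert_near(keras.layers.ReLU()(x), tf.nn.relu(x))
tf.debugging.assert_near(keras.layers.Softmax()(x), tf.nn.softmax(x))
tf.debugging.assert_near(keras.layers.LeakyReLU()(x), tf.nn.leaky_relu(x, alpha=0.3))
tf.debugging.assert_near(keras.layers.ELU()(x), tf.nn.elu(x))
```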