ValueError: Found unexpected losses or metrics that do not correspond to any Model output

I am using a CSV dataset with 1 feature column (a string) and 97 label columns (multi-label classification), each containing 0 or 1 for every row.
Data:

(The 97 label columns are condensed here for readability: only the columns equal to 1 are listed for each row; all other columns are 0.)

|    | vectorizer_input                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              | label columns equal to 1 (of 0–96) |
|---:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------|
|  0 | want improve question update question on-topic stack overflow closed 8 year ago json need update {activity_code1 activity_namefootball} {activity_code1 activity_nametennis} based activity_code achieve php first need decode change data re-encode save back file | 4, 20 |
|  1 | string xml need return xml document default returned content type text plain content rendered need content type application xml enabled option respectbrowseracceptheader serialize object xml set correct content type except object string string xml need return xml document return contentresult controller startup project.json response github good measure | 29 |
|  2 | volumetric imaging data consisting value sampled regular grid x z non-cubic voxel shape space adjacent point z greater x would eventually like able interpolate value arbitrary 2d plane pass volume like aware scipy.ndimage.map_coordinates case using le straightforward implicitly assumes spacing element input array equal across dimension could first resample input array according smallest voxel dimension voxels would cube use map_coordinates interpolate plane seem like great idea interpolate data twice also aware scipy various interpolators irregularly-spaced nd data linearndinterpolator nearestndinterpolator etc. slow memory-intensive purpose best way interpolating data given know value regularly spaced within dimension use map_coordinates little bit algebra let say spacing grid dx dy dz need map real world coordinate array index coordinate let define three new variable array index input map_coordinates array shape ... number dimension original data define array transform real world coordinate array index coordinate dividing scaling little broadcasting magic put together example let say want interpolate value along plane 2*y - z = 0 take two vector perpendicular plane normal vector get coordinate want interpolate convert array index coordinate interpoalte using map_coordinates last array shape 10 10 position [u_idx v_idx] value corresponding coordinate coords[ u_idx v_idx] could build idea handle interpolation coordinate start zero adding offset scaling | 0, 37 |
|  3 | best way calculate sum matrix a^i + a^ i+1 + a^i+2........a^n large n thought two possible way 1 use logarithmic matrix exponentiation lme a^i calculate subsequent matrix multiplying problem really take advantage lme algorithm using lowest power 2 use lme finding a^n memoize intermediate calculation problem much space required large n third way notice let logarithmic number step need compute inverse direct implementation lead log n ^2 factor keep log n computing power compute b | 48, 73 |

How I load the dataset:

tfdataset = tf.data.experimental.make_csv_dataset(
    name,
    batch_size=128,
    column_names=feature_cols + label_cols,
    column_defaults=[tf.dtypes.string] + [tf.dtypes.int64] * 97,
    shuffle=False
)
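For reference, `make_csv_dataset` yields each batch as an `OrderedDict` keyed by column name. A minimal sketch with a hypothetical two-label CSV (not my real file) shows the structure:

```python
import csv
import os
import tempfile

import tensorflow as tf

# Hypothetical miniature CSV: one string feature and two 0/1 label columns.
path = os.path.join(tempfile.mkdtemp(), "mini.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["vectorizer_input", "0", "1"])
    writer.writerow(["some text", 1, 0])
    writer.writerow(["more text", 0, 1])

ds = tf.data.experimental.make_csv_dataset(
    path,
    batch_size=2,
    column_defaults=[tf.dtypes.string, tf.dtypes.int64, tf.dtypes.int64],
    shuffle=False,
    num_epochs=1,
)

# Each element is an OrderedDict mapping column name -> batched tensor.
print(ds.element_spec)
```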

How I specify multiple label columns:

def make_features_labels_tuple(x, feature_cols, label_cols):
    labels = {k: x[k] for k in x if k in label_cols}
    features = {k: x[k] for k in x if k in feature_cols}
    return features, labels

tfdataset = tfdataset.map(lambda x: make_features_labels_tuple(x, feature_cols, label_cols))
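After this map, the labels side of each element is a dict of 97 separate tensors keyed by column name. For comparison, a variant that stacks them into a single `(batch, 97)` tensor (assuming `label_cols` lists the columns in the desired order, and that the single feature column can be passed as a plain tensor) would look like:

```python
import tensorflow as tf

def make_features_labels_tensor(x, feature_cols, label_cols):
    # Stack the per-column label tensors into one (batch, n_labels) tensor
    # instead of keeping them as a dict keyed by column name.
    labels = tf.stack([x[k] for k in label_cols], axis=-1)
    # With a single string feature column, pass the tensor itself.
    features = x[feature_cols[0]]
    return features, labels
```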

This is my simple model right now:

vectorizer = layers.TextVectorization(max_tokens=4000, output_sequence_length=4000)

model = models.Sequential(
    [
        vectorizer,
        layers.Embedding(4000, 64),
        layers.GlobalMaxPool1D(),
        layers.Dense(97, activation='sigmoid')
    ]
)

model.compile(
    optimizer='adam',
    loss=losses.BinaryCrossentropy(),
    metrics=[metrics.Accuracy(), metrics.Precision(), metrics.Recall()]
)

Model summary:

_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 text_vectorization_44 (Text  (None, 4000)             0         
 Vectorization)                                                  
                                                                 
 embedding_16 (Embedding)    (None, 4000, 64)          256000    
                                                                 
 global_max_pooling1d_16 (Gl  (None, 64)               0         
 obalMaxPooling1D)                                               
                                                                 
 dense_16 (Dense)            (None, 97)                6305      
                                                                 
=================================================================
Total params: 262,305
Trainable params: 262,305
Non-trainable params: 0
_________________________________________________________________

And this is the error I get when I call model.fit:

ValueError: Found unexpected losses or metrics that do not correspond to any Model output: dict_keys(['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', '10', '11', '12', '13', '14', '15', '16', '17', '18', '19', '20', '21', '22', '23', '24', '25', '26', '27', '28', '29', '30', '31', '32', '33', '34', '35', '36', '37', '38', '39', '40', '41', '42', '43', '44', '45', '46', '47', '48', '49', '50', '51', '52', '53', '54', '55', '56', '57', '58', '59', '60', '61', '62', '63', '64', '65', '66', '67', '68', '69', '70', '71', '72', '73', '74', '75', '76', '77', '78', '79', '80', '81', '82', '83', '84', '85', '86', '87', '88', '89', '90', '91', '92', '93', '94', '95', '96']). Valid mode output names: ['dense_7']. Received struct is: {'0': <tf.Tensor 'IteratorGetNext:1' shape=(128,) dtype=int64>, '1': <tf.Tensor 'IteratorGetNext:2' shape=(128,) dtype=int64>, '2': <tf.Tensor 'IteratorGetNext:13' shape=(128,) dtype=int64>, '3': <tf.Tensor 'IteratorGetNext:24' shape=(128,) dtype=int64>, '4': <tf.Tensor 'IteratorGetNext:35' shape=(128,) dtype=int64>, '5': <tf.Tensor 'IteratorGetNext:46' shape=(128,) dtype=int64>, '6': <tf.Tensor 'IteratorGetNext:57' shape=(128,) dtype=int64>, '7': <tf.Tensor 'IteratorGetNext:68' shape=(128,) dtype=int64>, '8': <tf.Tensor 'IteratorGetNext:79' shape=(128,) dtype=int64>, ...