
Tensor shapes error only in the neural network model fit process


I'm building a neural network for a Kaggle competition and I've run into a very strange error that I have no idea how to fix.

I constructed the model successfully and plotted it with plot_model(model, show_shapes=True):

(image: graph of the layers where the strange error occurs)

In this graph, the layers 'cube_b1_c14_0' and 'weather_c14' have exactly the same shape, (None, 24, 8, 3) (that's right), so I multiply them with the Keras Multiply layer in 'cube_b1_c14_1', and the result shown in the same graph has output shape (None, 24, 8, 3) (that's correct too).
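For reference, a stripped-down multiply of two inputs with this shape builds exactly as the plot shows; this is just a minimal sketch with made-up layer names, not my real model:

    from tensorflow.keras import layers, Input

    # Stand-ins for 'weather_c14' and 'cube_b1_c14_0', both (None, 24, 8, 3)
    a = Input(shape=(24, 8, 3), name='weather_c14_like')
    b = Input(shape=(24, 8, 3), name='cube_b1_c14_0_like')
    out = layers.multiply([a, b], name='cube_b1_c14_1_like')
    print(out.shape)  # (None, 24, 8, 3), matching the plot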

In my code the model was instantiated and compiled with no errors, but when I try to fit the model an error occurs in this specific multiply.

(image: screenshot of the strange error)

Curiously, all the other structures built in the same loop are multiplied with no errors.

Below is part of my code:

    business_0 = layers.Lambda(lambda x: tf.gather(x, indices=[mask['is_business'][0]], axis=2), name=f'is_business_0')(_input1)
    business_1 = layers.Lambda(lambda x: tf.gather(x, indices=[mask['is_business'][1]], axis=2), name=f'is_business_1')(_input1)
    filter_b0, filter_b1, filter_weather = [], [], []
    for ic, col_county in enumerate(mask['county']):
        aux_county = layers.Lambda(lambda x: tf.gather(x, indices=[col_county], axis=2), name=f'county_{ic}')(_input1)
        filter_b0 += [ layers.multiply([business_0, aux_county], name=f'filter_{ic}-0') ]  # 16*(,24,1)
        filter_b1 += [ layers.multiply([business_1, aux_county], name=f'filter_{ic}-1') ]  # 16*(,24,1)
        filter_weather += [ layers.Lambda(lambda x: tf.gather(x, indices=cube_map[ic], axis=3), name=f'weather_c{ic}')(_input2) ]
    cubes_b1 = []
    for county, layer in enumerate(filter_b1):
        _mask = expand_to_shape(layer, (24, 8, len(cube_map[county])), name=f'cube_b1_c{county}_0')
        aux = layers.multiply([ filter_weather[county], _mask ], name=f'cube_b1_c{county}_1')
        cubes_b1 += [ get_cube_reduction(aux, name=f'cube_b1_c{county}_2') ]

...

    cube_map = {0: [10, 11, 16, 20, 21, 26],
                1: [3],
                2: [44, 45, 47, 48],
                3: [19, 24, 25],
                4: [29, 30, 37],
                5: [31, 32, 38, 39],
                6: [6, 7],
                7: [5, 8, 12, 13, 17],
                8: [42],
                9: [9, 14, 15],
                10: [0, 1, 2, 4],
                11: [28, 35, 36, 43],
                13: [27],
                14: [18, 22, 23],
                15: [33, 34, 40, 41, 46],
                12: [6, 7]}

and the function expand_to_shape is:

    def expand_to_shape(tensor, shape: Union[Tuple[int] | List[int]], ignore_first_axis=True, name=None):
        steps = []
        first_axis_analisys = 1 if ignore_first_axis else 0
        if tensor.shape[1 + first_axis_analisys] != shape[first_axis_analisys]:
            steps += [ layers.Lambda(lambda x: tf.repeat(x, repeats=shape[first_axis_analisys], axis=-1)) ]
        _shape = shape[first_axis_analisys + 1:] if ignore_first_axis else shape[first_axis_analisys]
        for dim in _shape:
            steps += [ layers.Lambda(lambda x: tf.expand_dims(x, axis=-1)), layers.Lambda(lambda x: tf.repeat(x, repeats=dim, axis=-1)) ]
        if name is None: return Sequential(steps)(tensor)
        else: return Sequential(steps, name=name)(tensor)
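To double-check the mask shape for county 14 (cube_map[14] has length 3), a standalone check like the sketch below is what I mean by verifying the build-time shape; it assumes expand_to_shape and its imports (tensorflow.keras layers, Sequential, typing) are already in scope, and the input name is only illustrative:

    from tensorflow.keras import Input

    # Stand-in for one entry of filter_b1, which has shape (None, 24, 1)
    mask_in = Input(shape=(24, 1), name='filter_c14_like')
    expanded = expand_to_shape(mask_in, (24, 8, 3), name='cube_b1_c14_0_check')
    print(expanded.shape)  # (None, 24, 8, 3), same as 'weather_c14' in the plot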

Has any data scientist already run into an error like this? Can someone at least point me in a possible direction?

I have already:

- checked the shapes of both tensors involved in the multiply;
- confirmed that the cube_map dictionary has the expected values;
- made some small structural changes to avoid any crazy memory-aliasing mistakes;
- checked whether there was an error in the x_train / y_train data;
- tried other layers that do the same multiply operation (see the sketch after this list);
- asked ChatGPT and Gemini and searched on Google for errors like this.
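By "other layers that do the same multiply operation" I mean equivalent ways of writing the elementwise product, roughly like this sketch (stand-in inputs, not my real tensors):

    from tensorflow.keras import layers, Input

    w = Input(shape=(24, 8, 3))   # stand-in for filter_weather[14]
    m = Input(shape=(24, 8, 3))   # stand-in for the expanded mask
    out_a = layers.multiply([w, m])     # functional helper used in my loop
    out_b = layers.Multiply()([w, m])   # the Multiply layer class directly
    out_c = w * m                       # plain operator overload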

I hope to resolve this so I can submit my code to the competition.

