ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape argument ([11, 11, 3, 96]). #85

Open
mikechen66 opened this issue Jun 20, 2020 · 0 comments


I ran the finetune.py script under both TensorFlow 1.5 and TensorFlow 2.1. After resolving several other issues, I hit a shape-incompatibility error in alexnet.py. Please help fix the issue at your convenience. Thanks in advance.

As far as I can tell, there is a conflict between the shape argument and the conv arguments. tf.variable_scope() usually defines variables within its with context and influences the other related variables. For instance, shape=[filter_height, filter_width, input_channels//groups, num_filters] evaluates to [11, 11, 3, 96] for conv1; in contrast, conv1 is called with the arguments 11, 11, 96, 4, 4.
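Spelled out with the numbers involved (a plain-Python sketch; input_channels=3 and groups=1 are my assumptions, implied by an RGB input and the default groups argument of conv()):

```python
# conv1 call from the first snippet: conv(self.X, 11, 11, 4, 4, 96, ...)
filter_height, filter_width = 11, 11
num_filters = 96
input_channels = 3   # assumed: RGB input image
groups = 1           # default value of the groups parameter in conv()

# The kernel shape the conv() helper builds for the weights variable:
shape = [filter_height, filter_width, input_channels // groups, num_filters]
print(shape)  # [11, 11, 3, 96] -- the shape named in the error message
```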

1. Error Message

ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape argument ([11, 11, 3, 96]).

2. Attempted Changes

I tried to make the following changes.

1). Change the order of either the shape or the Conv1 arguments, for instance,
shape=[11,11,96,3]
or
conv(self.X, 11, 11, 4, 4, 96, name='conv1', padding='VALID')

2). Rename the shape variable:

kernel_shape= [filter_height, filter_width, input_channels//groups, num_filters]

3). Delete the shape= keyword and keep only the list argument:

[filter_height, filter_width, input_channels//groups, num_filters]

However, variants of the following errors still persist.

ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape argument ([11, 11, 3, 4]).

ValueError: Shapes must be equal rank, but are 4 and 0 for 'conv1/Variable/Assign' (op: 'Assign') with input shapes: [11,11,3,4], [].

ValueError: Shapes must be equal rank, but are 4 and 0 for 'conv1/Variable/Assign' (op: 'Assign') with input shapes: [11,11,3,96], [].

The critical issue is clearly the "shape" usage in the second snippet, but I have not yet figured out a way to solve it.

3. Snippets

1st snippet:

class AlexNet(object):
    .........
    def create(self):
        """Create the network graph."""
        # 1st Layer: Conv (w ReLu) -> Lrn -> Pool
        conv1 = conv(self.X, 11, 11, 4, 4, 96, name='conv1', padding='VALID')
        norm1 = lrn(conv1, 2, 2e-05, 0.75, name='norm1')
        pool1 = max_pool(norm1, 3, 3, 2, 2, name='pool1', padding='VALID')

2nd snippet:

def conv(x, filter_height, filter_width, stride_y, stride_x, num_filters, name,
         padding='SAME', groups=1):
    .........
    with tf.compat.v1.variable_scope(name) as scope:
        weights = tf.Variable('weights', shape=[filter_height, filter_width,
                                                input_channels//groups,
                                                num_filters])
        biases = tf.Variable('biases', shape=[num_filters])
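For what it's worth, the error message itself hints at the likely cause: tf.Variable's first positional parameter is initial_value, not a name (a name string as the first argument is tf.compat.v1.get_variable's signature). The string 'weights' therefore becomes the initial value, a scalar with shape (), which can never match shape=[11, 11, 3, 96] no matter how the list is reordered. The stand-in below (plain Python, no TensorFlow required) reproduces the mismatch; the get_variable lines in the comments are an untested sketch of a possible fix.

```python
# Minimal stand-in for tf.Variable's signature: the FIRST positional
# parameter is initial_value, not a variable name.
def fake_variable(initial_value=None, name=None, shape=None):
    # Crude shape inference: strings and numbers are scalars with shape ().
    init_shape = (len(initial_value),) if isinstance(initial_value, (list, tuple)) else ()
    if shape is not None and list(init_shape) != list(shape):
        raise ValueError(
            "The initial value's shape (%s) is not compatible with the "
            "explicitly supplied shape argument (%s)." % (init_shape, shape))
    return initial_value

# Passing the string 'weights' first makes it the initial value -- shape ()
# -- exactly the "(())" reported in the traceback.
#
# A possible fix (hedged sketch, untested): create the variables by name
# with get_variable instead of tf.Variable:
#     weights = tf.compat.v1.get_variable(
#         'weights', shape=[filter_height, filter_width,
#                           input_channels // groups, num_filters])
#     biases = tf.compat.v1.get_variable('biases', shape=[num_filters])
```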

4. Detailed error message

$ python finetune.py

Traceback (most recent call last):
  File "finetune.py", line 91, in <module>
    model = AlexNet(x, keep_prob, num_classes, train_layers)
  File "/home/mike/Documents/finetune_alexnet_with_tf/alexnet.py", line 56, in __init__
    self.create()
  File "/home/mike/Documents/finetune_alexnet_with_tf/alexnet.py", line 61, in create
    conv1 = conv(self.X, 11, 11, 96, 4, 4, padding='VALID', name='conv1')
  File "/home/mike/Documents/finetune_alexnet_with_tf/alexnet.py", line 147, in conv
    num_filters])
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/variables.py", line 260, in __call__
    return cls._variable_v2_call(*args, **kwargs)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/variables.py", line 254, in _variable_v2_call
    shape=shape)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/variables.py", line 235, in <lambda>
    previous_getter = lambda **kws: default_variable_creator_v2(None, **kws)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/variable_scope.py", line 2645, in default_variable_creator_v2
    shape=shape)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/variables.py", line 262, in __call__
    return super(VariableMetaclass, cls).__call__(*args, **kwargs)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py", line 1411, in __init__
    distribute_strategy=distribute_strategy)
  File "/home/mike/miniconda3/lib/python3.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py", line 1549, in _init_from_args
    (initial_value.shape, shape))
ValueError: The initial value's shape (()) is not compatible with the explicitly supplied shape argument ([11, 11, 3, 96]).
