
also having issues with the demo #14

Open
ClaireleNobel opened this issue Mar 6, 2017 · 6 comments
@ClaireleNobel

This is what I get on step 2. I tried working with the fixes that people suggested on the other posts, but have still not had any luck. Any suggestions would be greatly appreciated!

ValueError Traceback (most recent call last)
in ()
6 model = Model(num_time_samples=num_time_samples,
7 num_channels=num_channels,
----> 8 gpu_fraction=gpu_fraction)
9
10 Audio(inputs.reshape(inputs.shape[1]), rate=44100)

/root/shared/fast-wavenet-master/wavenet/models.py in __init__(self, num_time_samples, num_channels, num_classes, num_blocks, num_layers, num_hidden, gpu_fraction)
34 for i in range(num_layers):
35 rate = 2**i
---> 36 name = 'b{}-l{}'.format(b, i)
37 h = dilated_conv1d(h, num_hidden, rate=rate, name=name)
38 hs.append(h)

/root/shared/fast-wavenet-master/wavenet/layers.py in dilated_conv1d(inputs, out_channels, filter_width, rate, padding, name, gain, activation)
142 padding=padding,
143 gain=gain,
--> 144 activation=activation)
145 _, conv_out_width, _ = outputs_.get_shape().as_list()
146 new_width = conv_out_width * rate

/root/shared/fast-wavenet-master/wavenet/layers.py in conv1d(inputs, out_channels, filter_width, stride, padding, data_format, gain, activation, bias)
89 w = tf.get_variable(name='w',
90 shape=(filter_width, in_channels, out_channels),
---> 91 initializer=w_init)
92
93 outputs = tf.nn.conv1d(inputs,

/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.pyc in get_variable(name, shape, dtype, initializer, regularizer, trainable, collections, caching_device, partitioner, validate_shape, custom_getter)
986 collections=collections, caching_device=caching_device,
987 partitioner=partitioner, validate_shape=validate_shape,
--> 988 custom_getter=custom_getter)
989 get_variable_or_local_docstring = (
990 """%s

/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.pyc in get_variable(self, var_store, name, shape, dtype, initializer, regularizer, trainable, collections, caching_device, partitioner, validate_shape, custom_getter)
888 collections=collections, caching_device=caching_device,
889 partitioner=partitioner, validate_shape=validate_shape,
--> 890 custom_getter=custom_getter)
891
892 def _get_partitioned_variable(self,

/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.pyc in get_variable(self, name, shape, dtype, initializer, regularizer, reuse, trainable, collections, caching_device, partitioner, validate_shape, custom_getter)
346 reuse=reuse, trainable=trainable, collections=collections,
347 caching_device=caching_device, partitioner=partitioner,
--> 348 validate_shape=validate_shape)
349
350 def _get_partitioned_variable(

/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.pyc in _true_getter(name, shape, dtype, initializer, regularizer, reuse, trainable, collections, caching_device, partitioner, validate_shape)
331 initializer=initializer, regularizer=regularizer, reuse=reuse,
332 trainable=trainable, collections=collections,
--> 333 caching_device=caching_device, validate_shape=validate_shape)
334
335 if custom_getter is not None:

/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.pyc in _get_single_variable(self, name, shape, dtype, initializer, regularizer, partition_info, reuse, trainable, collections, caching_device, validate_shape)
637 " Did you mean to set reuse=True in VarScope? "
638 "Originally defined at:\n\n%s" % (
--> 639 name, "".join(traceback.format_list(tb))))
640 found_var = self._vars[name]
641 if not shape.is_compatible_with(found_var.get_shape()):

ValueError: Variable b0-l0/w already exists, disallowed. Did you mean to set reuse=True in VarScope? Originally defined at:

File "wavenet/layers.py", line 91, in conv1d
initializer=w_init)
File "wavenet/layers.py", line 144, in dilated_conv1d
activation=activation)
File "wavenet/models.py", line 36, in __init__
name = 'b{}-l{}'.format(b, i)

@tomlepaine
Owner

Does this error occur when you run the demo notebook?

Did you try restarting the notebook and running all cells in order?

It sounds like for some reason the same variable is being created twice when you run it.
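A common way this happens in a notebook is re-running the model-construction cell: the first run already added b0-l0/w to the default graph, so the second get_variable call collides. A minimal sketch of the situation and one workaround (written against the TF1-style API via tf.compat.v1 so it runs on newer TensorFlow; the era-appropriate code would use plain import tensorflow as tf):

```python
import tensorflow.compat.v1 as tf  # TF1-style API; on TF 0.12/1.x use `import tensorflow as tf`
tf.disable_eager_execution()

def build_layer():
    # Mirrors what models.py does: one weight variable per block/layer scope.
    with tf.variable_scope('b0-l0'):
        return tf.get_variable('w', shape=(2, 1, 128))

w = build_layer()
# Calling build_layer() a second time here would raise:
#   ValueError: Variable b0-l0/w already exists, disallowed. ...
tf.reset_default_graph()   # drop the old graph before re-running the cell
w = build_layer()          # succeeds again on the fresh graph
```

Resetting the default graph (or restarting the kernel) before rebuilding the model avoids the collision without touching the library code.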

@ClaireleNobel
Author

Yes, this happens when I'm running in Jupyter. I just tried restarting the notebook and running the cells in order, but the error appears on just the second cell. I have attached a screenshot so that you can see the colours. I really hope this works!

(screenshot attached: screen shot 2017-03-06 at 4.39.46 pm)

@brannondorsey

brannondorsey commented Apr 19, 2017

I believe this is an error that occurs when you are running with tensorflow version >=1.0. I received this error, downgraded to tensorflow 0.12.0, and that fixed it.

pip uninstall tensorflow # tensorflow-gpu
pip install tensorflow==0.12 # tensorflow-gpu==0.12

@tnarik

tnarik commented Jul 23, 2017

Yes, the issue is the version of TensorFlow. It would be great if the project included a requirements.txt file or similar to ensure consistency.

Failing that, you can fairly easily port it to TensorFlow 1.1.0 at least (I didn't try my modification on the new 1.2).
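For anyone wanting to pin the environment as suggested, a hypothetical requirements.txt based on the versions reported to work in this thread (swap in the -gpu package if you have a GPU build) could look like:

```text
# requirements.txt (hypothetical; versions taken from this thread)
tensorflow==0.12.0   # or tensorflow-gpu==0.12.0; last version reported working unmodified
```

Install with pip install -r requirements.txt; newer TensorFlow versions need the code changes discussed above.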

@ishandutta2007

ishandutta2007 commented Nov 5, 2018

@tnarik @brannondorsey @WeiHan3 if you have succeeded in training it by now, can you please share the model?

@aryachiranjeev

I am running it on Colab and facing this error.
@tnarik, can you help with this?

TypeError Traceback (most recent call last)
in ()
7 model = Model(num_time_samples=num_time_samples,
8 num_channels=num_channels,
----> 9 gpu_fraction=gpu_fraction)
10
11 Audio(inputs.reshape(inputs.shape[1]), rate=44100)

6 frames
/content/fastwavenet/wavenet/models.py in __init__(self, num_time_samples, num_channels, num_classes, num_blocks, num_layers, num_hidden, gpu_fraction)
34 h = dilated_conv1d(h, num_hidden, rate=rate, name=name)
35 hs.append(h)
---> 36
37 outputs = conv1d(h,
38 num_classes,

/content/fastwavenet/wavenet/layers.py in dilated_conv1d(inputs, out_channels, filter_width, rate, padding, name, gain, activation)
136 with tf.variable_scope(name):
137 _, width, _ = inputs.get_shape().as_list()
--> 138 inputs_ = time_to_batch(inputs, rate=rate)
139 outputs_ = conv1d(inputs_,
140 out_channels=out_channels,

/content/fastwavenet/wavenet/layers.py in time_to_batch(inputs, rate)
24 padded = tf.pad(inputs, [[0, 0], [pad_left, 0], [0, 0]])
25 transposed = tf.transpose(padded, perm)
---> 26 reshaped = tf.reshape(transposed, shape)
27 outputs = tf.transpose(reshaped, perm)
28 return outputs

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/array_ops.py in reshape(tensor, shape, name)
129 A Tensor. Has the same type as tensor.
130 """
--> 131 result = gen_array_ops.reshape(tensor, shape, name)
132 tensor_util.maybe_set_static_shape(result, shape)
133 return result

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/gen_array_ops.py in reshape(tensor, shape, name)
8113 # Add nodes to the TensorFlow graph.
8114 _, _, _op = _op_def_lib._apply_op_helper(
-> 8115 "Reshape", tensor=tensor, shape=shape, name=name)
8116 _result = _op.outputs[:]
8117 _inputs_flat = _op.inputs

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/framework/op_def_library.py in _apply_op_helper(self, op_type_name, name, **keywords)
630 _SatisfiesTypeConstraint(base_type,
631 _Attr(op_def, input_arg.type_attr),
--> 632 param_name=input_name)
633 attrs[input_arg.type_attr] = attr_value
634 inferred_from[input_arg.type_attr] = input_name

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/framework/op_def_library.py in _SatisfiesTypeConstraint(dtype, attr_def, param_name)
59 "allowed values: %s" %
60 (param_name, dtypes.as_dtype(dtype).name,
---> 61 ", ".join(dtypes.as_dtype(x).name for x in allowed_list)))
62
63

TypeError: Value passed to parameter 'shape' has DataType float32 not in list of allowed values: int32, int64
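For anyone hitting the same TypeError on Colab: this looks like the Python 2 vs Python 3 division change. The traceback ends in time_to_batch, where the reshape target is built with /; in Python 3 that returns a float, and tf.reshape only accepts integer shapes. A minimal sketch of the cause and fix, assuming the shape tuple in layers.py divides the padded width by the rate (the exact expression in your copy may differ):

```python
# Illustrative values: a padded width divided by a dilation rate.
width_pad, rate = 20000, 2

bad_dim = width_pad / rate    # Python 3: 10000.0, a float -> tf.reshape rejects it
good_dim = width_pad // rate  # floor division keeps it an int

assert isinstance(bad_dim, float)
assert isinstance(good_dim, int)

# So in wavenet/layers.py, time_to_batch's shape should use // (or int(...)),
# e.g. shape = (width_pad // rate, -1, num_channels), so every entry is an int.
shape = (good_dim, -1, 1)
assert all(isinstance(d, int) for d in shape)
```

Under Python 2 (as in the original 2017 setup) / already returned an int, which is why this only surfaces on Python 3 environments like Colab.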
