Out of bounds issue with vocab size not divisible by 10 #1

Open
ftyers opened this issue Sep 17, 2016 · 0 comments

Comments


ftyers commented Sep 17, 2016

I get the error below when I try to run training with a vocabulary size that is not divisible by 10. Padding the vocabulary out to a multiple of 10 works around the problem, but it would be good to know why it is happening.
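For reference, the workaround is just to pad the symbol inventory before the lookup table is built. A minimal sketch of what I mean (the helper name and the `<PAD*>` dummy symbols are illustrative, not the actual morf-gen-nn code):

```python
def pad_vocab(symbols, multiple=10):
    """Pad a symbol inventory with dummy entries until its size is a
    multiple of `multiple`, giving the lookup table spare rows."""
    padded = list(symbols)
    while len(padded) % multiple != 0:
        # Dummy symbols never occur in the data, so their embedding rows
        # are never used; they only enlarge the lookup table.
        padded.append('<PAD%d>' % len(padded))
    return padded

vocab = pad_vocab(['a', 'b', 'c'] * 23)   # 69 symbols -> padded to 70
assert len(vocab) % 10 == 0
```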

```
Cost graph is built
INFO:blocks.algorithms:Taking the cost gradient
INFO:blocks.algorithms:The cost gradient computation graph is built
INFO:blocks.main_loop:Entered the main loop
INFO:blocks.algorithms:Initializing the training algorithm
INFO:blocks.algorithms:The training algorithm is initialized
ERROR:blocks.main_loop:Error occured during training.

Blocks will attempt to run `on_error` extensions, potentially saving data, before exiting and reraising the error. Note that the usual `after_training` extensions will *not* be run. The original error will be re-raised and also stored in the training log. Press CTRL + C to halt Blocks immediately.

-------------------------------------------------------------------------------
BEFORE FIRST EPOCH
-------------------------------------------------------------------------------
Training status:
     batch_interrupt_received: False
     epoch_interrupt_received: False
     epoch_started: True
     epochs_done: 0
     iterations_done: 0
     received_first_batch: False
     resumed_from: None
     training_started: True
Log records from the iteration 0:
     time_initialization: 31.707899903063662

Traceback (most recent call last):
  File "morf-gen-nn/train.py", line 275, in <module>
    main_loop.run()
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/main_loop.py", line 197, in run
    reraise_as(e)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/utils/__init__.py", line 258, in reraise_as
    six.reraise(type(new_exc), new_exc, orig_exc_traceback)
  File "/usr/lib/python3/dist-packages/six.py", line 686, in reraise
    raise value
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/main_loop.py", line 183, in run
    while self._run_epoch():
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/main_loop.py", line 232, in _run_epoch
    while self._run_iteration():
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/main_loop.py", line 253, in _run_iteration
    self.algorithm.process_batch(batch)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/algorithms/__init__.py", line 190, in process_batch
    self._function(*ordered_batch)
  File "/home/fran/local/lib/python3.5/site-packages/Theano-0.8.2-py3.5.egg/theano/compile/function_module.py", line 871, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "/home/fran/local/lib/python3.5/site-packages/Theano-0.8.2-py3.5.egg/theano/gof/link.py", line 314, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "/usr/lib/python3/dist-packages/six.py", line 685, in reraise
    raise value.with_traceback(tb)
  File "/home/fran/local/lib/python3.5/site-packages/Theano-0.8.2-py3.5.egg/theano/compile/function_module.py", line 859, in __call__
    outputs = self.fn()
IndexError: index 69 is out of bounds for size 69
Apply node that caused the error: AdvancedSubtensor1(W, Reshape{1}.0)
Toposort index: 105
Inputs types: [TensorType(float64, matrix), TensorType(int64, vector)]
Inputs shapes: [(69, 100), (190,)]
Inputs strides: [(800, 8), (8,)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[Reshape{3}(AdvancedSubtensor1.0, MakeVector{dtype='int64'}.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
  File "morf-gen-nn/train.py", line 233, in <module>
    batch_cost = m.cost(chars, chars_mask, targets, targets_mask).sum()
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 362, in __call__
    return self.application.apply(self, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 297, in apply
    outputs = self.application_function(brick, *args, **kwargs)
  File "morf-gen-nn/train.py", line 133, in cost
    self.fork.apply(self.lookup.apply(chars), as_dict=True),mask=chars_mask)),
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 362, in __call__
    return self.application.apply(self, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 297, in apply
    outputs = self.application_function(brick, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/lookup.py", line 68, in apply
    return self.W[indices.flatten()].reshape(output_shape)

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Original exception:
    IndexError: index 69 is out of bounds for size 69
Apply node that caused the error: AdvancedSubtensor1(W, Reshape{1}.0)
Toposort index: 105
Inputs types: [TensorType(float64, matrix), TensorType(int64, vector)]
Inputs shapes: [(69, 100), (190,)]
Inputs strides: [(800, 8), (8,)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[Reshape{3}(AdvancedSubtensor1.0, MakeVector{dtype='int64'}.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
  File "morf-gen-nn/train.py", line 233, in <module>
    batch_cost = m.cost(chars, chars_mask, targets, targets_mask).sum()
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 362, in __call__
    return self.application.apply(self, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 297, in apply
    outputs = self.application_function(brick, *args, **kwargs)
  File "morf-gen-nn/train.py", line 133, in cost
    self.fork.apply(self.lookup.apply(chars), as_dict=True),mask=chars_mask)),
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 362, in __call__
    return self.application.apply(self, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/base.py", line 297, in apply
    outputs = self.application_function(brick, *args, **kwargs)
  File "/home/fran/local/lib/python3.5/site-packages/blocks-0.2.0-py3.5.egg/blocks/bricks/lookup.py", line 68, in apply
    return self.W[indices.flatten()].reshape(output_shape)

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
```
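Reading the traceback: the failure is in the `LookupTable` brick (`blocks/bricks/lookup.py`, `self.W[indices.flatten()]`), where `W` has shape `(69, 100)` and some character in the batch has index 69, i.e. one past the last row. So at least one symbol is being assigned an index equal to the vocabulary size, which looks like an off-by-one between how the vocabulary is counted and the `dim` given to `LookupTable`; padding to a multiple of 10 presumably just enlarges the table past the stray index. A minimal NumPy sketch of the same failure mode, with shapes taken from the traceback:

```python
import numpy as np

vocab_size, embedding_dim = 69, 100
W = np.zeros((vocab_size, embedding_dim))  # the lookup table, rows 0..68

ok = np.array([0, 5, 68])
_ = W[ok]                      # fine: all indices < vocab_size

bad = np.array([0, 5, 69])     # 69 == vocab_size, one past the end
try:
    _ = W[bad]                 # same failure mode as AdvancedSubtensor1
except IndexError as e:
    print(e)                   # out-of-bounds IndexError for index 69
```

A quick sanity check would be to assert, before training, that the maximum index in each batch is strictly less than the lookup table's `dim`.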