Restoring parameters for training DocumentQA #54

Open
alontalmor opened this issue Oct 28, 2018 · 0 comments

alontalmor commented Oct 28, 2018

I'm a PhD candidate of Jonathan Berant's, and we are trying to continue
training your DocumentQA model from a saved checkpoint.

Is this supported in the code, and what is the best way to do it?

To be more specific: we use ablate_triviaqa_unfiltered.py as our training
script, and it seems that "checkpoint" and "parameter_checkpoint" are meant
to support this. However, it is unclear why there are two different variables
for it, and why restore is called in two places:

In _train_async() in trainer.py:

Line 501 (notice that the restore reads checkpoint, not parameter_checkpoint;
is this a bug?):
if parameter_checkpoint is not None:
    print("Restoring parameters from %s" % parameter_checkpoint)
    saver = tf.train.Saver()
    saver.restore(sess, checkpoint)
    saver = None
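
Our reading (a guess on our part, not confirmed by the code or docs) is that
parameter_checkpoint is meant to initialize the model weights only, e.g. from
a pre-trained model, while checkpoint resumes a full run. Under that reading,
the restore above would presumably use the path that was actually checked:

if parameter_checkpoint is not None:
    print("Restoring parameters from %s" % parameter_checkpoint)
    saver = tf.train.Saver()
    saver.restore(sess, parameter_checkpoint)  # presumed intent, not the current code
    saver = None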

Line 351:
if checkpoint is not None:
    print("Restoring from checkpoint...")
    saver.restore(sess, checkpoint)
    print("Loaded checkpoint: " + str(sess.run(global_step)))
else:
    print("Initializing parameters...")
    sess.run(tf.global_variables_initializer())
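
For context, this is the standard TF1 resume pattern we expected (a
self-contained toy sketch of ours, not DocumentQA code; checkpoint_dir and
the toy model are placeholders):

import os
import tensorflow as tf

# Hypothetical toy model, only to make the sketch runnable.
x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.get_variable("w", shape=[1, 1])
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - y))
global_step = tf.train.get_or_create_global_step()
train_op = tf.train.AdamOptimizer(0.01).minimize(loss, global_step=global_step)

checkpoint_dir = "checkpoints"  # placeholder path, not from the repo
os.makedirs(checkpoint_dir, exist_ok=True)
saver = tf.train.Saver()

with tf.Session() as sess:
    latest = tf.train.latest_checkpoint(checkpoint_dir)
    if latest is not None:
        saver.restore(sess, latest)  # restores weights and global_step together
        print("Resumed at step %d" % sess.run(global_step))
    else:
        sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op, feed_dict={x: [[1.0]], y: [[2.0]]})
    saver.save(sess, os.path.join(checkpoint_dir, "model"), global_step=global_step)

Restoring the full Saver checkpoint also brings back the optimizer's slot
variables and the global step, which is what we want when continuing a run,
as opposed to only loading the weights.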

Thanks!
