GHMM training #41

Open · renatocf opened this issue Sep 23, 2015 · 1 comment

@renatocf (Member)
Implement the segseq logic inside the GHMM static method train and its creator.

The intended interface is:

/* Client code */
auto ghmm_trainer = GHMM::labelingTrainer(state_trainers);
ghmm_trainer->addTrainingSet(seq1);
ghmm_trainer->addTrainingSet(seq2);
...
ghmm_trainer->addTrainingSet(seqN);
ghmm_trainer->train(algorithm_tag, ...);

/* GHMM::train code */
{
  // Split each labeled sequence into segments and hand every segment
  // to the trainer of the state that emitted it
  for (const auto& labeled_seq : split_sequence(ghmm_trainer->trainingSet())) {
    ghmm_trainer->state(labeled_seq.index())->addTrainingSet(labeled_seq.sequence());
  }
  // Train each state's emission model on its accumulated segments
  for (const auto& state_trainer : ghmm_trainer->trainers()) {
    state_trainer->train(...); /* How to pass these parameters? */
  }
  /* train GHMM transitions */
}
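For reference, here is a minimal sketch of what split_sequence could do for a single labeled sequence, assuming a labeled sequence is represented as parallel symbol/label vectors; LabeledSequence, Segment, and all member names below are hypothetical, not the real ToPS types:

#include <cstddef>
#include <vector>

// Hypothetical representation: parallel vectors, one with the emitted
// symbols and one with the index of the GHMM state emitting each symbol.
struct LabeledSequence {
  std::vector<int> symbols;
  std::vector<std::size_t> labels;
};

// A maximal run of symbols emitted by a single state, exposing the
// index()/sequence() accessors used in the pseudocode above.
class Segment {
 public:
  explicit Segment(std::size_t index) : index_(index) {}

  std::size_t index() const { return index_; }
  const std::vector<int>& sequence() const { return sequence_; }
  void push_back(int symbol) { sequence_.push_back(symbol); }

 private:
  std::size_t index_;
  std::vector<int> sequence_;
};

// Split a labeled sequence into maximal runs of equal labels, so each
// run can be handed to the trainer of the corresponding state.
std::vector<Segment> split_sequence(const LabeledSequence& seq) {
  std::vector<Segment> segments;
  for (std::size_t i = 0; i < seq.labels.size(); ++i) {
    if (segments.empty() || segments.back().index() != seq.labels[i])
      segments.emplace_back(seq.labels[i]);
    segments.back().push_back(seq.symbols[i]);
  }
  return segments;
}

Splitting a whole training set would just concatenate the per-sequence results.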
renatocf assigned renatocf and unassigned renatocf on Sep 23, 2015
@renatocf (Member, Author)
@igorbonadio,

So, I came up with an idea that may solve our problem: it allows the method train to keep receiving parameters without (explicitly) creating any parameter objects. However, I need to know: how will the trainers (#10) communicate with the ToPS language (#11) to create a GHMM? How will a tops::lang AST get values at runtime and pass them to the different train methods?

Before implementing anything further, I believe we should think a little more about how the two will interact.
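One way to keep train receiving per-algorithm parameters without explicit parameter objects is tag dispatch combined with perfect forwarding: the algorithm tag selects an overload whose signature is exactly the parameter list that algorithm needs. The sketch below is only a suggestion; the tag types, the trainImpl overloads, and their parameter lists are all hypothetical:

#include <cstddef>
#include <iostream>
#include <utility>

// Hypothetical algorithm tags, one per training algorithm.
struct baum_welch {};
struct maximum_likelihood {};

// Hypothetical state trainer: train() dispatches on an algorithm tag and
// perfectly forwards the remaining arguments, so the caller never builds
// an explicit parameter object.
class StateTrainer {
 public:
  template <typename Tag, typename... Args>
  void train(Tag tag, Args&&... args) {
    trainImpl(tag, std::forward<Args>(args)...);
  }

 private:
  // One overload per algorithm, each with its own parameter list.
  void trainImpl(baum_welch, std::size_t max_iterations, double tolerance) {
    std::cout << "Baum-Welch: " << max_iterations << " iterations, "
              << "tolerance " << tolerance << '\n';
  }

  void trainImpl(maximum_likelihood, double pseudocounts) {
    std::cout << "ML: pseudocounts " << pseudocounts << '\n';
  }
};

int main() {
  StateTrainer trainer;
  trainer.train(baum_welch{}, 100, 1e-4);   // tag picks the overload
  trainer.train(maximum_likelihood{}, 0.1);
}

A wrong argument list fails at compile time, which is the main advantage over an untyped parameter map; the open question is how a tops::lang AST, which only knows values at runtime, would produce such statically typed calls.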
