Please consider improvement for page - 1 #6

Open
burlachenkok opened this issue Jan 28, 2022 · 0 comments

Comments

@burlachenkok

  1. For the CLT, you could add the requirement that the variance is finite and the mathematical expectation exists. The Cauchy distribution, which satisfies neither, is sometimes encountered in engineering, e.g. in the laser industry.
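A minimal numerical sketch of this point (the seed and sample size are arbitrary choices): the sample mean of a normal distribution stabilizes near 0 as n grows, while the sample mean of a Cauchy distribution does not settle, because its expectation and variance are undefined and the CLT's assumptions are violated.

```python
import numpy as np

# The CLT needs a finite mean and variance. The standard Cauchy
# distribution has neither, so its sample mean does NOT stabilize:
# the mean of n Cauchy samples is itself standard Cauchy, for any n.
rng = np.random.default_rng(0)
n = 1_000_000
normal_mean = rng.normal(size=n).mean()            # close to 0 for large n
cauchy_mean = rng.standard_cauchy(size=n).mean()   # still Cauchy-distributed
print(normal_mean, cauchy_mean)
```

Rerunning with different seeds shows `cauchy_mean` jumping around while `normal_mean` stays pinned near zero.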

  2. Bias is the error caused by the choice of the set of functions you select from: the objective function may be far from its projection onto the set of functions you are considering.

  3. Variance is associated with the instability of the obtained solutions with respect to sampling the training examples from the population.
    For (2) and (3) it is better to read the Statistical Learning book; the definitions in the CheatSheet are too vague, at least for me.
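A toy Monte-Carlo sketch can make (2) and (3) concrete. The setup here is hypothetical: a constant model fit to repeated samples from a sin target with Gaussian noise. The constant model class cannot represent sin, which produces bias; resampling the training set shifts the fit, which produces variance.

```python
import numpy as np

# Monte-Carlo sketch of bias and variance: fit a constant model to many
# datasets drawn from the same population and examine the fitted values.
rng = np.random.default_rng(0)
true_f = np.sin
x0 = 1.0                        # point where we evaluate the prediction
fits = []
for _ in range(1000):
    xs = rng.uniform(0, np.pi, 20)
    ys = true_f(xs) + rng.normal(0, 0.1, 20)
    fits.append(ys.mean())      # constant model: predicts the sample mean
fits = np.array(fits)
bias = fits.mean() - true_f(x0)  # error from the restricted function class
variance = fits.var()            # instability across resampled datasets
print(bias, variance)
```

The bias is nonzero (the constant model's average prediction misses sin(1)) no matter how many datasets are drawn, while the variance shrinks if each dataset is made larger.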

  4. Non-parametric models are models without trainable parameters, for example the nearest-neighbor method (or more sophisticated models from MetaLearning that, in one of their flavors, use non-parametric blocks).
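To illustrate (a minimal 1-nearest-neighbor sketch; the data and function names are made up for the example): nothing is fit during training, the "model" is just the stored training data.

```python
import numpy as np

# 1-nearest-neighbor: a non-parametric method. There is no training step
# and no fitted parameters; prediction looks up the closest stored example.
def predict_1nn(X_train, y_train, x):
    i = np.argmin(np.linalg.norm(X_train - x, axis=1))
    return y_train[i]

X_train = np.array([[0.0], [1.0], [2.0]])
y_train = np.array([0, 1, 1])
p0 = predict_1nn(X_train, y_train, np.array([0.2]))  # nearest is [0.0]
p1 = predict_1nn(X_train, y_train, np.array([1.6]))  # nearest is [2.0]
print(p0, p1)
```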

  5. Cross-Validation does not perform any validation; rather, it is an estimate. If you want validation in the sense of guarantees, this principle is not suitable.
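A hand-rolled k-fold sketch makes the point visible in code: the output is just an average of per-fold losses, i.e. a point estimate with its own sampling variance, not a certificate about the model. The helper names and the trivial "training mean" model below are hypothetical.

```python
import numpy as np

# K-fold cross-validation returns an ESTIMATE of generalization error:
# the average validation loss over k train/validation splits.
def kfold_cv_error(X, y, fit, predict, k=5):
    idx = np.arange(len(y))
    errors = []
    for val in np.array_split(idx, k):
        train = np.setdiff1d(idx, val)
        model = fit(X[train], y[train])
        errors.append(np.mean((predict(model, X[val]) - y[val]) ** 2))
    return float(np.mean(errors))   # a point estimate, nothing guaranteed

# Toy usage: the "model" is simply the mean of the training targets.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1))
y = rng.normal(size=100)
fit = lambda X, y: y.mean()
predict = lambda m, X: np.full(len(X), m)
err = kfold_cv_error(X, y, fit, predict)
print(err)
```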

  6. In Subset selection you use the "0-norm"; that is very bad jargon. The zero-"norm" is not a norm: it is not homogeneous (multiplying a vector by 10 does not multiply the value by 10). Use the word cardinality(·) instead.
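The failure of homogeneity is easy to check directly (a small sketch; `card` is a made-up name for the nonzero count):

```python
import numpy as np

# card(x) counts the nonzero entries (what the "0-norm" jargon denotes).
# A true norm must be absolutely homogeneous: ||a*x|| == |a| * ||x||.
def card(x):
    return int(np.count_nonzero(x))

x = np.array([3.0, 0.0, -1.0])
print(card(x))       # 2
print(card(10 * x))  # still 2, not 10 * 2 -> not homogeneous, not a norm
```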
