Source: www.cs.princeton.edu (PDF, 0.55 MB)
       Numerical Optimization Techniques
Léon Bottou
                           NEC Labs America
                       COS 424 – 3/2/2010
Today’s Agenda

Goals
– Classification, clustering, regression, other.

Representation
– Parametric vs. kernels vs. nonparametric
– Probabilistic vs. nonprobabilistic
– Linear vs. nonlinear
– Deep vs. shallow

Capacity Control
– Explicit: architecture, feature selection
– Explicit: regularization, priors
– Implicit: approximate optimization
– Implicit: Bayesian averaging, ensembles

Operational Considerations
– Loss functions
– Budget constraints
– Online vs. offline

Computational Considerations
– Exact algorithms for small datasets
– Stochastic algorithms for big datasets
– Parallel algorithms
      Introduction
      General scheme
      – Set a goal.
      – Define a parametric model.
      – Choose a suitable loss function.
      – Choose suitable capacity control methods.
      – Optimize average loss over the training set.
      Optimization
– Sometimes analytic (e.g., a linear model with squared loss).
– Usually numerical (e.g., everything else).
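The analytic case can be illustrated concretely. A minimal sketch (the data, names, and sizes below are assumptions, not from the slides): minimizing the average squared loss of a linear model leads to the normal equations, which have a closed-form solution.

```python
import numpy as np

# Illustrative synthetic data (all names and sizes are assumptions).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 examples, 3 features
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

# Minimizing the average squared loss  (1/n) * sum_i (x_i . w - y_i)^2
# over w gives the normal equations  X^T X w = X^T y,
# solved here directly rather than by iterative optimization.
w_analytic = np.linalg.solve(X.T @ X, X.T @ y)
print(w_analytic)
```

With nonlinear models or other loss functions no such closed form exists in general, which is where the numerical techniques below come in.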
      Summary
           1. Convex vs. Nonconvex
           2. Differentiable vs. Nondifferentiable
           3. Constrained vs. Unconstrained
           4. Line search
           5. Gradient descent
           6. Hessian matrix, etc.
           7. Stochastic optimization
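Items 4 and 5 of the summary combine naturally: gradient descent chooses a direction, and a line search chooses how far to move along it. A minimal sketch for a convex, differentiable objective (the same average squared loss; all names and constants are illustrative assumptions), using backtracking line search with the Armijo sufficient-decrease condition:

```python
import numpy as np

def loss_and_grad(w, X, y):
    """Average squared loss of a linear model and its gradient."""
    r = X @ w - y
    n = len(y)
    return (r @ r) / n, 2.0 * (X.T @ r) / n

def gradient_descent(X, y, steps=100, alpha0=1.0, beta=0.5, c=1e-4):
    """Gradient descent with backtracking line search (illustrative constants)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        f, g = loss_and_grad(w, X, y)
        # Backtracking: shrink the step until the Armijo condition
        #   f(w - alpha * g) <= f(w) - c * alpha * ||g||^2
        # guarantees sufficient decrease.
        alpha = alpha0
        while loss_and_grad(w - alpha * g, X, y)[0] > f - c * alpha * (g @ g):
            alpha *= beta
        w = w - alpha * g
    return w

# Illustrative synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.05 * rng.normal(size=200)
w = gradient_descent(X, y)
print(w)
```

Because the objective here is convex and differentiable, the iterates converge to the same minimizer the closed-form normal equations would give; the stochastic variants in item 7 replace the full gradient with a per-example estimate to scale to large datasets.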