Showing items 1 - 2 of 2

Nonsmooth DC programming approach to clusterwise linear regression: Optimality conditions and algorithms

- Bagirov, Adil, Ugon, Julien

  • Authors: Bagirov, Adil, Ugon, Julien
  • Date: 2018
  • Type: Text, Journal article
  • Relation: Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.
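The description above mentions the squared regression error objective and its representation as a difference of convex (DC) functions. The following is only an illustrative sketch of that formulation in standard notation, not text from the paper: for data points (a_i, b_i), i = 1, ..., m, and k linear functions with coefficients w^j and intercepts w_0^j, the clusterwise objective and a DC split f = f_1 - f_2 can be written as

    f(w, w_0) = \sum_{i=1}^{m} \min_{1 \le j \le k} h_{ij},
    \qquad h_{ij} = \bigl( \langle w^j, a_i \rangle + w_0^j - b_i \bigr)^2,

    f_1(w, w_0) = \sum_{i=1}^{m} \sum_{j=1}^{k} h_{ij},
    \qquad
    f_2(w, w_0) = \sum_{i=1}^{m} \max_{1 \le j \le k} \sum_{l \ne j} h_{il}.

Both f_1 and f_2 are convex (sums of squared affine functions, and a maximum of such sums), and f = f_1 - f_2 because \min_j h_{ij} = \sum_j h_{ij} - \max_j \sum_{l \ne j} h_{il}. This is the kind of DC structure that the optimality conditions and algorithm described in the abstract build on.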

Nonsmooth DC programming approach to clusterwise linear regression: Optimality conditions and algorithms

- Bagirov, Adil, Ugon, Julien

  • Authors: Bagirov, Adil, Ugon, Julien
  • Date: 2018
  • Type: Text, Journal article
  • Relation: Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
  • Full Text: false
  • Reviewed:
  • Description: The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets.
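The abstract also mentions an incremental approach for generating starting solutions. The Python sketch below is a rough, assumed illustration of that incremental idea only: it seeds each new hyperplane on the points fitted worst so far and then refines with ordinary least squares and alternating reassignment (a Späth-style heuristic), not the paper's nonsmooth DC algorithm. All names here (fit_cluster, sq_errors, clusterwise_lr) are hypothetical.

    import numpy as np

    def fit_cluster(A, b):
        # Ordinary least squares with an intercept column; returns d + 1 coefficients.
        X = np.hstack([A, np.ones((A.shape[0], 1))])
        coef, *_ = np.linalg.lstsq(X, b, rcond=None)
        return coef

    def sq_errors(A, b, coefs):
        # Squared regression error of every point under every hyperplane: shape (m, k).
        X = np.hstack([A, np.ones((A.shape[0], 1))])
        return (X @ np.column_stack(coefs) - b[:, None]) ** 2

    def clusterwise_lr(A, b, k, n_iter=50):
        # Incremental build-up: reuse the solution with j - 1 hyperplanes when adding the j-th.
        coefs = [fit_cluster(A, b)]                     # k = 1 is plain linear regression
        for _ in range(1, k):
            resid = sq_errors(A, b, coefs).min(axis=1)  # current fit of each point
            worst = resid >= np.quantile(resid, 0.75)   # seed the new hyperplane on poorly fitted points
            coefs.append(fit_cluster(A[worst], b[worst]))
            for _ in range(n_iter):                     # crude alternating refinement
                assign = sq_errors(A, b, coefs).argmin(axis=1)
                coefs = [fit_cluster(A[assign == j], b[assign == j])
                         if np.any(assign == j) else coefs[j]
                         for j in range(len(coefs))]
        return coefs

For example, clusterwise_lr(A, b, k=3) with A of shape (m, d) and b of shape (m,) returns three coefficient vectors of length d + 1, the last entry of each being the intercept.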

  • «
  • ‹
  • 1
  • ›
  • »
  • English (United States)
  • English (United States)
  • Disclaimer
  • Privacy
  • Copyright
  • Contact
  • Federation Library
  • Federation ResearchOnline policy
  • About Vital

‹ › ×

    Clear Session

    Are you sure you would like to clear your session, including search history and login status?