Julia v0.4.0 is still about a month away (I think). JuMP (and JuliaOpt in
general) supports both v0.3 and v0.4 for now, so you can try out v0.4 now if
you want. JuMP v0.10.x will probably be the last series of releases for
JuMP on v0.3, depending on when v0.4 comes out.

On Thu, Sep 3, 2015 at 6:27 AM, Leonardo Taccari <[email protected]> wrote:

> Great job!
>
> By the way, what's the deal with the *official* switch to Julia 0.4? Is it
> going to happen soon? I'm not really following the development of the
> language, and I don't even know what's going to change. I hope it won't
> break too many things. :-)
>
>
> On Tuesday, September 1, 2015 at 6:41:21 AM UTC+2, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.10.
>>
>> This is a major release with the greatest amount of new functionality
>> since the addition of nonlinear modeling last year. This will likely be the
>> last major release of JuMP to support Julia 0.3. Thanks to the heroic work
>> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for
>> *semidefinite programming*.
>>
>> You can now write, for example:
>>
>> @defVar(m, x[1:5])
>> @addConstraint(m, A*x .== 0)
>>
>> where A is a Julia matrix (dense or sparse). Note that we require dot
>> comparison operators .== (and similarly .<= and .>=) for vectorized
>> constraints. The vectorized syntax extends to quadratic but not general
>> nonlinear expressions.
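>>
>> For example, a minimal end-to-end model with the new syntax (a sketch;
>> the data are purely illustrative, and solving assumes an LP solver is
>> installed):
>>
>> using JuMP
>> m = Model()
>> A = [1.0 2.0 3.0 4.0 5.0;
>>      5.0 4.0 3.0 2.0 1.0]          # any dense or sparse Julia matrix
>> @defVar(m, 0 <= x[1:5] <= 1)       # a one-based array of variables
>> @addConstraint(m, A*x .== 0)       # note the dot comparison .==
>> @setObjective(m, Max, sum(x))
>> solve(m)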
>>
>> An important new concept to keep in mind is that this vectorized syntax
>> only applies to sets of variables that are one-based arrays. If you
>> declare variables indexed by more complicated sets, e.g.,
>>
>> @defVar(m, y[3:5])
>> s = [:cat, :dog, :pizza]
>> @defVar(m, z[s])
>>
>> then dot(y,z) and rand(3,3)*z are undefined. One consequence of this new
>> concept of one-based arrays is that x above now has the type
>> Vector{JuMP.Variable}. In this case, getValue() now returns a
>> Vector{Float64} instead of an opaque JuMP object. We hope users find
>> this new distinction between one-indexed array variables and all other
>> symbolically indexed variables useful and intuitive (if not, let us know).
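>>
>> Concretely, the value types differ after a solve (a small sketch):
>>
>> @defVar(m, x[1:5])   # one-based array: typeof(x) is Vector{JuMP.Variable}
>> @defVar(m, y[3:5])   # not one-based
>> # ... after solve(m):
>> getValue(x)          # returns a Vector{Float64}
>> getValue(y)[4]       # indexed through an opaque JuMP object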
>>
>> For semidefinite modeling, you can declare variables as SDP matrices and
>> add LMI (linear matrix inequality) constraints as illustrated in the
>> examples for minimal ellipse
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/minellipse.jl>
>>  and
>> max cut
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/maxcut_sdp.jl>,
>> among others.
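>>
>> In brief, an SDP model looks roughly like the following (a sketch; see
>> the linked examples for the authoritative syntax, and note that solving
>> requires a conic solver with SDP support):
>>
>> m = Model()
>> @defVar(m, X[1:3,1:3], SDP)           # X constrained to be PSD
>> @addSDPConstraint(m, X >= eye(3))     # the LMI X - I >= 0
>> @setObjective(m, Min, sum{X[i,i], i=1:3})   # i.e., minimize trace(X)
>> solve(m)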
>>
>> We also have a *new syntax for Euclidean norms:*
>>
>> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
>> # or
>> @addConstraint(m, norm(c.*x+b) <= 10)
>>
>> You may be wondering how JuMP compares with Convex.jl given these new
>> additions. Not much has changed philosophically; JuMP directly translates
>> SDP constraints and Euclidean norms into the sparse matrix formats
>> required by conic solvers. Unlike Convex.jl, *JuMP accepts only
>> standard-form SDP and second-order conic constraints and will not perform
>> any automatic transformations* such as modeling nuclear norms, minimum
>> eigenvalue, geometric mean, rational norms, etc. We recommend
>> Convex.jl for easy modeling of such functions. Our focus, for now, is on
>> large-scale performance and on stabilizing the large amount of new syntax
>> introduced in this release.
>>
>> Also notable in this release:
>> - JuMP models now store a dictionary of attached variables, so you can
>> look up a variable in a model by name using the new getVar() method (see
>> the sketch after this list).
>> - On Julia 0.4 only, you can now filter variable declarations, e.g.,
>> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
>> creates variables only for the indices that satisfy the filter
>> condition. (These are not one-based arrays as introduced above.)
>> - Dual multipliers are available for nonlinear problems from the solvers
>> that provide them.
>> - There is improved documentation for querying derivatives from a
>> nonlinear JuMP model
>> <http://jump.readthedocs.org/en/latest/nlp.html#querying-derivatives-from-a-jump-model>.
>> - *We now try to print warnings for two common performance traps*:
>> calling getValue() in a tight loop and using operator overloading to
>> construct large JuMP expressions. Please let us know if these are useful or
>> annoying or both so that we can tune the warning thresholds.
>> - Thanks to Tony Kelman and Jack Dunn, you can now call a large number of
>> external solvers, including Bonmin and Couenne, through either the .osil or
>> .nl exchange formats.
>> - For those on Julia 0.4, module precompilation considerably speeds up
>> loading JuMP.
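>>
>> The getVar() lookup in code form (a quick sketch):
>>
>> m = Model()
>> @defVar(m, x[1:5])
>> x_ref = getVar(m, :x)   # retrieve the variables registered under :x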
>>
>> The delay since the last release of JuMP is mostly due to us trying to
>> test and refine the new syntax, but inevitably some bugs have slipped
>> through, so please let us know of any incorrect or confusing behavior.
>>
>> Also newsworthy is our new paper <http://arxiv.org/abs/1508.01982>
>> describing the methods used in JuMP with benchmark comparisons to existing
>> open-source and commercial optimization modeling software.
>>
>> Miles, Iain, and Joey
>>



-- 
*Iain Dunning*
PhD Candidate <http://orc.scripts.mit.edu/people/student.php?name=idunning>
 / MIT Operations Research Center <http://web.mit.edu/orc/www/>
http://iaindunning.com  /  http://juliaopt.org
