Some good things to know about how indexing works:
Indices for a Vector, or for a column or row of a Matrix, start at *1*.
```
length(avector)   # number of elements in avector
avector[1]        # first element of avector
avector[end]      # last element of avector
avector[1:end]    # all elements of avector

int_column_vector = [10, 20, 30]

int_column_vector[1]    # 10

# do not use zero as an index
int_column_vector[0]    # ERROR: BoundsError

# do not use false or true as indices: avec[false] means avec[0]
```
In `w[1,(w[1].<z)&(w[1].>-(z))] = 0`, the second index can simplify to
`false`. Consider this:
```
avec = [10, 20, 30]
avec1 = avec[1]
avec1 == avec[1 + false]   # true: false behaves like 0, so this is avec[1]
avec2 = avec[2]
avec2 == avec[1 + true]    # true: true behaves like 1, so this is avec[2]
```
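A scalar Bool is a bad index, but a whole Bool array of matching length is a perfectly good *logical* index, which is likely what the expression was reaching for. A small sketch (the values here are made up):

```julia
avec = [10, 20, 30]
mask = [true, false, true]   # Bool vector, same length as avec
avec[mask]                   # selects positions where mask is true: [10, 30]
avec[mask] .= 0              # zero those positions (plain = also works on Julia 0.5)
```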
As a first step, recheck your indexing expressions and make sure they do what
you want them to do.
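For instance, assuming the goal is to zero the small-magnitude entries of the matrix stored in `w[1]`, one way the thresholding might be written is to index that matrix directly with a Bool mask. This is only a sketch with made-up names: `w1` stands in for `w[1]`, and the scalar `z` stands in for `lr * lambda` (on Julia 0.5 use `&` and plain `=` instead of `.&` and `.=`):

```julia
# Sketch: zero the entries of a 1x13 weight matrix whose magnitude is below z.
w1 = 0.1 * randn(1, 13)
z  = 0.2
mask = (w1 .< z) .& (w1 .> -z)   # Bool matrix, same shape as w1
w1[mask] .= 0                    # logical indexing assigns to every true position
```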
On Wednesday, November 16, 2016 at 1:36:57 PM UTC-5, Patrik Waldmann wrote:
>
> Hi,
>
> I'm an R user trying to learn Julia. I got hold of some code from the Knet
> package that I was playing around with. My goal is to set values to zero in
> a loop based on a logical expression, but I cannot figure out how the
> indexing works. Any help would be appreciated (the problem lies in
> w[1,(w[1].<z)&(w[1].>-(z))] = 0):
>
> using Knet
> predict(w,x) = w[1]*x .+ w[2]
> lambda = 2
> z = Array{Float64}(1,13)
> loss(w,x,y) = sumabs2(y - predict(w,x)) / size(y,2)
> lossgradient = grad(loss)
> function train(w, data; lr=.1)
> for (x,y) in data
> dw = lossgradient(w, x, y)
> z[:] = lr * lambda
> w[1] -= lr * dw[1]
> w[2] -= lr * dw[2]
> w[1,(w[1].<z)&(w[1].>-(z))] = 0
> end
> return w
> end
> url = "https://archive.ics.uci.edu/ml/machine-learning-databases/housing/housing.data"
> rawdata = readdlm(download(url))
> x = rawdata[:,1:13]'
> x = (x .- mean(x,2)) ./ std(x,2)
> y = rawdata[:,14:14]'
> w = Any[ 0.1*randn(1,13), 0 ]
> niter = 25
> lossest = zeros(niter)
> for i=1:niter; train(w, [(x,y)]); lossest[i]=loss(w,x,y); end
>
>
> Best regards,
>
> Patrik
>