formatted output

2013-05-07 Thread Sudheer Joseph
Dear members,
I need to print a few arrays in tabular form, for example as below:
array IL has 25 elements; is there an easy way to print this as a 5x5
comma-separated table in Python?

import numpy as np

bno = 25            # number of elements
IL = []
for i in np.arange(1, bno + 1):
    IL.append(i)
print(IL)
%
In Fortran I could do it as below:
%
      integer matrix(5,5)
      in = 0
      do k = 1, 5
         do l = 1, 5
            in = in + 1
            matrix(k,l) = in
         enddo
      enddo
      m = 5
      n = 5
      do i = 1, m
         write(*,"(5i5)") ( matrix(i,j), j = 1, n )
      enddo
      end
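A minimal sketch of one way to do this with NumPy (assuming IL holds the 25 integers built above):

import numpy as np

IL = np.arange(1, 26)                          # the 25 values from the loop above
for row in IL.reshape(5, 5):
    print(', '.join(str(v) for v in row))      # one comma-separated row per line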
 
-- 
http://mail.python.org/mailman/listinfo/python-list


netCDF4 variables

2013-05-31 Thread Sudheer Joseph
Dear members,
I have been using the Python netCDF4 interface for some time. I understand
that we can get variables from a netCDF file one by one using
temp = ncf.variables['temp'][:]
but is there a way to get a list of variable names without the rest of the stuff,
as seen below? Something like
xx = ncf.variables[:]
should get me all the variable names without the other surrounding stuff??
with best regards.
Sudheer

In [4]: ncf.variables
Out[4]: OrderedDict([(u'LON', ...), (u'LAT', ...), (u'DEPTH1_1', ...), (u'TAX', ...),
                     (u'DIF_FD1', ...), (u'DIF_FD2', ...), (u'DIF_FD3', ...),
                     (u'DIF_FD4', ...), (u'DIF_FD5', ...), (u'DEPTH', ...),
                     (u'DEPTH_bnds', ...), (u'TIME', ...), (u'TEMP_BIAS', ...)])
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: netCDF4 variables

2013-06-01 Thread Sudheer Joseph

Thank you very much it works for me.
with best regards,
Sudheer
On Saturday, June 1, 2013 12:51:01 PM UTC+5:30, Andreas Perstinger wrote:
> On 01.06.2013 05:30, Sudheer Joseph wrote:
> > Something like
> > xx = ncf.variables[:]
> > should get me all the variable names without the other surrounding stuff??
> >
> > In [4]: ncf.variables
> > Out[4]: OrderedDict([(u'LON', ...),
> [SNIP]
>
> It looks like "variables" is an OrderedDict. Thus
>
>  >>> ncf.variables.keys()
>
> should return a view (or list, depending on your Python version) of all
> keys, i.e. all variable names.
>
> Bye, Andreas
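
A minimal sketch of this in practice (the file name is only a placeholder):

from netCDF4 import Dataset

ncf = Dataset('my_file.nc', 'r')
names = list(ncf.variables.keys())    # just the variable names, nothing else
print(names)
ncf.close()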

-- 
http://mail.python.org/mailman/listinfo/python-list


python netcdf

2013-06-05 Thread Sudheer Joseph
Dear Members,
  Is there a way to get the time:origin attribute from a netCDF
file as a string using the Python netCDF interface?
with best regards,
Sudheer
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: python netcdf

2013-06-05 Thread Sudheer Joseph
Thank you very much Jason
With best regards
Sudheer

On Thursday, June 6, 2013, Jason Swails wrote:

>
> On Wed, Jun 5, 2013 at 9:07 PM, Sudheer Joseph wrote:
>
>> Dear Members,
>>   Is there a way to get the time:origin attribute from a
>> netCDF file as a string using the Python netCDF interface?
>>
>
> Attributes of the NetCDF file and attributes of each of the variables can
> be accessed via the dot-operator, as per standard Python.
>
> For instance, suppose that your NetCDF file has a Conventions attribute,
> you can access it via:
>
> ncfile.Conventions
>
> Suppose that your variable, time, has an attribute "origin", you can get
> it via:
>
> ncfile.variables['time'].origin
>
> Of course there's the question of what NetCDF bindings you're going to
> use.  The options that I'm familiar with are the ScientificPython's
> NetCDFFile class (Scientific.IO.NetCDF.NetCDFFile), pynetcdf (which is just
> the ScientificPython's class in a standalone format), and the netCDF4
> package.  Each option has a similar API with attributes accessed the same
> way.
>
> An example with netCDF4 (which is newer, has NetCDF 4 capabilities, and
> appears to be more supported):
>
> from netCDF4 import Dataset
>
> ncfile = Dataset('my_netcdf_file.nc', 'r')
>
> origin = ncfile.variables['time'].origin
>
> etc. etc.
>
> The variables and dimensions of a NetCDF file are stored in dictionaries,
> and the data from variables are accessible via slicing:
>
> time_data = ncfile.variables['time'][:]
>
> The slice returns a numpy ndarray.
>
> HTH,
> Jason
>
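
Putting Jason's pieces together, a minimal sketch (assuming the netCDF4 package,
a placeholder file name, and a variable named 'time' that carries an 'origin'
attribute as in the question):

from netCDF4 import Dataset

ncfile = Dataset('my_netcdf_file.nc', 'r')
origin = ncfile.variables['time'].origin               # dot-operator attribute access
# equivalently: ncfile.variables['time'].getncattr('origin')
print(str(origin))                                     # text attributes come back as strings
ncfile.close()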


-- 
Sent from my iPad Mini
-- 
http://mail.python.org/mailman/listinfo/python-list


writing fortran equivalent binary file using python

2013-11-13 Thread Sudheer Joseph
Hi,
 I need to write a binary file exactly as written by the Fortran code below,
to be read by another code which is part of a model that is not advisable to
edit. I would like to use Python for this purpose, as Python has more flexibility
and easier coding methods.
  
  character(40) :: TITLE="122322242"
  integer :: IWI, JWI
  real :: XFIN, YFIN, DXIN=0.5, DYIN=0.5, WDAY(6000)
  real, allocatable, dimension(:,:,:) :: VAR1_VAL
  real, allocatable, dimension(:,:,:) :: VAR2_VAL
  XFIN=0.0; YFIN=-90.0; NREC=1461; DXIN=0.5; DYIN=0.5; IWI=720; JWI=361

  open(11, file=outf, form='UNFORMATTED')
  WRITE(11) TITLE
  WRITE(11) NX, NY, XFIN, YFIN, DXIN, DYIN, NREC, WDAY
  write(*,'(A10,2f10.3)') "START=", VAR1_VAL(1,1,1), VAR2_VAL(1,1,1)
  write(*,'(A10,2f10.3)') "END=", VAR1_VAL(nx,ny,nrec), VAR2_VAL(nx,ny,nrec)
  do i=1,NREC
     WRITE(11) VAR1_VAL(:,:,i), VAR2_VAL(:,:,i)
     WRITE(*,'(2I10,f10.3)') NX, NY, WDAY(i)
  enddo

My trial code in Python (the data is read from a netCDF file here):

from netCDF4 import Dataset as nc
import numpy as np
XFIN = 0.0; YFIN = -90.0; NREC = 1461; DXIN = 0.5; DYIN = 0.5
TITLE = "NCMRWF 6HOURLY FORCING MKS"
nf = nc('ncmrwf_uv.nc')
ncv = nf.variables.keys()
IWI = len(nf.variables[ncv[0]])
JWI = len(nf.variables[ncv[1]])
WDAY = nf.variables[ncv[2]][0:NREC]
U = nf.variables[ncv[3]][0:NREC,:,:]
V = nf.variables[ncv[4]][0:NREC,:,:]
f = open('ncmrwf_uv.bin', "wb")
f.write(TITLE)
f.write(IWI, JWI, XFIN, YFIN, DXIN, DYIN, NREC, WDAY)
for i in np.arange(0, NREC):
    f.write(U[i,:,:], V[i,:,:])
f.close()

But the issue is that f.write() does not allow multiple values (it accepts them
one by one, so the code above throws an error) on the same write statement like in
the Fortran code. Could the experts please advise if there is a solution for this?
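
One possible way around this (a sketch, not a drop-in replacement): scipy.io.FortranFile
writes the record-length markers that a Fortran unformatted sequential READ expects, and
its write_record method accepts several arrays for a single record in recent SciPy
versions. The sizes below are tiny placeholders for the data read from the netCDF file;
native endianness and 4-byte markers are assumed, matching common compiler defaults:

import numpy as np
from scipy.io import FortranFile

# Dummy values standing in for TITLE, IWI, JWI, ..., U, V read above
TITLE = "122322242".ljust(40)
IWI, JWI, NREC = 4, 3, 2                      # in practice 720, 361, 1461
XFIN, YFIN, DXIN, DYIN = 0.0, -90.0, 0.5, 0.5
WDAY = np.zeros(6000, dtype=np.float32)
U = np.zeros((NREC, JWI, IWI), dtype=np.float32)
V = np.zeros((NREC, JWI, IWI), dtype=np.float32)

f = FortranFile('ncmrwf_uv.bin', 'w')
f.write_record(np.frombuffer(TITLE.encode('ascii'), dtype=np.uint8))
f.write_record(np.array([IWI, JWI], dtype=np.int32),
               np.array([XFIN, YFIN, DXIN, DYIN], dtype=np.float32),
               np.array([NREC], dtype=np.int32),
               WDAY)
for i in range(NREC):
    f.write_record(U[i, :, :], V[i, :, :])    # both fields in one record, as in the Fortran loop
f.close()

Whether the model's reader accepts the file still depends on its compiler's conventions
for record markers and on the 4-byte real/integer sizes assumed here.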

with best regards,
Sudheer
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: writing fortran equivalent binary file using python

2013-11-14 Thread Sudheer Joseph
Thank you,
  But that won't let me write the file in the unformatted (record-based) form
that the Fortran code can read with

open(11,file="input.bin")
read(11) IWI,JWI,XFIN,YFIN,DXIN,DYIN,NREC,WDAY

with best regards,
sudheer


On Thu, Nov 14, 2013 at 7:48 PM, Oscar Benjamin
wrote:

> On 14 November 2013 00:53, Sudheer Joseph  wrote:
> > My trial code in Python (the data is read from a netCDF file here):
> >
> > from netCDF4 import Dataset as nc
> > import numpy as np
> > XFIN = 0.0; YFIN = -90.0; NREC = 1461; DXIN = 0.5; DYIN = 0.5
> > TITLE = "NCMRWF 6HOURLY FORCING MKS"
> > nf = nc('ncmrwf_uv.nc')
> > ncv = nf.variables.keys()
> > IWI = len(nf.variables[ncv[0]])
> > JWI = len(nf.variables[ncv[1]])
> > WDAY = nf.variables[ncv[2]][0:NREC]
> > U = nf.variables[ncv[3]][0:NREC,:,:]
> > V = nf.variables[ncv[4]][0:NREC,:,:]
> > f = open('ncmrwf_uv.bin', "wb")
> > f.write(TITLE)
> > f.write(IWI, JWI, XFIN, YFIN, DXIN, DYIN, NREC, WDAY)
> > for i in np.arange(0, NREC):
> >     f.write(U[i,:,:], V[i,:,:])
> > f.close()
> >
> > But the issue is that f.write() does not allow multiple values (it accepts
> > them one by one, so the code above throws an error) on the same write
> > statement like in the Fortran code. Could the experts please advise if
> > there is a solution for this?
>
> Can you just call write twice? e.g.:
>
> f.write(U[i,:,:])
> f.write(V[i,:,:])
>
>
> Oscar
>
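
For reference, the framing that read(11) expects can also be written by hand: each
unformatted sequential record is a 4-byte byte count, the payload, then the same byte
count again (with the usual compiler defaults). A sketch with placeholder values,
assuming native endianness and 4-byte markers:

import struct
import numpy as np

def write_fortran_record(fh, payload):
    # payload is a bytes object; frame it with leading/trailing 4-byte length markers
    marker = struct.pack('=i', len(payload))
    fh.write(marker + payload + marker)

# Dummy header values in place of IWI, JWI, XFIN, YFIN, DXIN, DYIN, NREC, WDAY
ints = np.array([720, 361], dtype=np.int32)
reals = np.array([0.0, -90.0, 0.5, 0.5], dtype=np.float32)
nrec = np.array([1461], dtype=np.int32)
wday = np.zeros(6000, dtype=np.float32)

with open('input.bin', 'wb') as fh:
    payload = ints.tobytes() + reals.tobytes() + nrec.tobytes() + wday.tobytes()
    write_fortran_record(fh, payload)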



-- 
with best regards

Sudheer

**
Dr. Sudheer Joseph

Scientist

INDIAN NATIONAL CENTRE FOR OCEAN INFORMATION SERVICES (INCOIS)
MINISTRY OF EARTH SCIENCES, GOVERNMENT OF INDIA
"OCEAN VALLEY" PRAGATHI NAGAR (BO)
OPP.JNTU, NIZAMPET SO
Andhra Pradesh, India. PIN- 500 090.
TEl:+91-40-23044600(R),Tel:+91-9440832534(Mobile)
Tel:+91-40-23886047(O),Fax:+91-40-23892910(O)
E-mail: sudheer.jos...@yahoo.com;  s...@incois.gov.in.
Web- http://oppamthadathil.tripod.com
   --* ---
"The ultimate measure of a man is
not where he stands in moments of
comfort and convenience, but where
he stands at times of challenge and
controversy."
Martin Luther King, Jr.
"What we have done for ourselves alone dies with us.
What we have done for others and the world remains and is immortal."
- Albert Pines
-- 
https://mail.python.org/mailman/listinfo/python-list


finding masking boundary indices

2013-11-23 Thread Sudheer Joseph
Hi,
   I have a masked array as in the attached link, and I want to find the
indices of the boundary where the mask is False, i.e. in this case of a depth
file, where the depth is shallower than the shore. Is there a pythonic way of
finding the boundary indices? Please advise.
https://drive.google.com/file/d/0B3heUQNme7G5d2dYZzgxTG1NdG8/edit?usp=sharing
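
One possible approach (a sketch with a small dummy masked array standing in for the
file behind the link): treat the unmasked cells as a boolean image, erode it, and keep
the cells that the erosion removed; those are the unmasked cells touching the mask.

import numpy as np
import numpy.ma as ma
from scipy import ndimage

# Dummy masked depth field standing in for the array from the linked file
depth = ma.masked_less(np.array([[5., 5., 0., 0.],
                                 [5., 5., 5., 0.],
                                 [0., 5., 5., 5.]]), 1.0)

valid = ~ma.getmaskarray(depth)             # True where the mask is False
interior = ndimage.binary_erosion(valid)    # valid cells not touching the mask
boundary = valid & ~interior                # valid cells on the edge of the mask
idx = np.argwhere(boundary)                 # (row, column) index pairs
print(idx)

With the default border_value=0, cells on the outer edge of the array are also
reported as boundary.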
-- 
https://mail.python.org/mailman/listinfo/python-list


refreshing the edited python function

2013-08-18 Thread Sudheer Joseph


Hi,
I have been using IPython (both plain and with the qtconsole) and working on code
organised into functions. Each time I modify a function I have to quit the IPython
console (in both cases, with and without the qt console) and reload the function
freshly if I want to see the changes I made. I tried the options below:

del function_name

import the module again by issuing "from xxx.py import yy"
import xxx.py
make changes
reload(xxx.py)

This works only if the function in the file has the same name as the file, and
even then it does not reflect the changes made by editing the code. So what is
the standard way to update the function for further tests after an edit?
with best regards,
Sudheer 
 
*******
Sudheer Joseph 
Indian National Centre for Ocean Information Services
Ministry of Earth Sciences, Govt. of India
POST BOX NO: 21, IDA Jeedeemetla P.O.
Via Pragathi Nagar,Kukatpally, Hyderabad; Pin:5000 55
Tel:+91-40-23886047(O),Fax:+91-40-23895011(O),
Tel:+91-40-23044600(R),Tel:+91-40-9440832534(Mobile)
E-mail:sjo.in...@gmail.com;sudheer.jos...@yahoo.com
Web- http://oppamthadathil.tripod.com
***-- 
http://mail.python.org/mailman/listinfo/python-list


Re: refreshing the edited python function

2013-08-19 Thread Sudheer Joseph
Thank you Dieter,
 I never thought it would be such a difficult task. All I was
thinking was that I just do not know how it is done. I wonder how code
developers work in this case; having to restart the console every time a
function is modified is a nightmare... Hope one day some solution will
evolve.
with best regards,
Sudheer




>
> From: dieter 
>To: python-list@python.org 
>Sent: Monday, 19 August 2013 11:48 AM
>Subject: Re: refreshing the edited python function
> 
>
>Sudheer Joseph  writes:
>
>> I have been using ipython and ipython with qtconsole and working on a code 
>> with functions. Each time I make a modification in function  
>>
>> I have to quit the IPython console (in both with and without the qt console) and 
>> reload the function freshly. If I need to see the changed I made in the 
>> function. I tried below options
>> del function name
>>
>> import the module again  by issuing "from xxx.py import yy"
>> import xxx.py
>> make changes
>> reload(xxx.py)
>> this
>>  works only if the function in the code has the same name as the code. 
>> But even this do not reflect the changes made by editing the code.
>> So what is the standard way to update the function for further tests after 
>> an edit?
>
>Getting changes into a running application is difficult.
>Python has not been designed to make this easy.
>
>The "reload" above is one partial way to achieve something like this.
>The "reload" causes the module to be reloaded. If you have changed
>the module's code, these changes will be reflected *inside* the reloaded
>module. However, other modules may have imported objects from
>this module (as  in your "from xxx.py import yy"). To see changes
>in those objects, they, too, must repeat the import (otherwise,
>they continue to use the old, unchanged object).
>
>There is an additional approach, used e.g. by "plone.reload".
>In this approach, the objects are modified "in place". All usage
>points of the modified object will see changes.
>However, there are (quite severe) limitations to what changes
>can be made "in place". Thus, this, too, does not give a complete
>solution.
>
>In simple cases, one of those approaches can avoid a restart
>after modifications. However, in general, a restart is required.
>
>-- 
>http://mail.python.org/mailman/listinfo/python-list
>
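
For the simple cases Dieter describes, IPython's autoreload extension (or the
reload/importlib.reload machinery directly) can save the restart. A minimal sketch,
assuming the code lives in a module called mymodule.py with a function yy
(hypothetical names):

In [1]: %load_ext autoreload
In [2]: %autoreload 2            # re-import modified modules before running code
In [3]: import mymodule
In [4]: mymodule.yy()            # edit mymodule.py, call again: the edit is picked up

# Plain Python 3 equivalent of the manual reload discussed above:
import importlib
import mymodule
importlib.reload(mymodule)       # subject to the same limitations Dieter describes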
>-- 
http://mail.python.org/mailman/listinfo/python-list


Re: refreshing the edited python function

2013-08-19 Thread Sudheer Joseph
- Original Message -

> From: Dave Angel 
> To: python-list@python.org
> Cc: 
> Sent: Monday, 19 August 2013 4:45 PM
> Subject: Re: refreshing the edited python function
> 
> Sudheer Joseph wrote:
> 
>>  Thank you Dieter,
>>   I never thought it would be such a difficult task. All I was
>>  thinking was that I just do not know how it is done. I wonder how code
>>  developers work in this case; having to restart the console every time a
>>  function is modified is a nightmare... Hope one day some solution will
>>  evolve.
>>  with best regards,
>>  Sudheer
>> 
> 
> Please don't top-post, and please use text messages, rather than html
> mail, when posting on this list.
> 
> Seems to me your problem is with ipython's IDE, not with Python.  Python
> requires you to rerun your application when making most changes to code.
> But it doesn't say anything about restarting a "console," whatever that
> is in this context.  I use Komodo IDE when I want IDE functionality,
> and never restart Komodo over hours of work.
> 
> 
Thank you Dave,
I will make sure of that the next time I post.
with best regards,
Sudheer

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: refreshing the edited python function

2013-08-21 Thread Sudheer Joseph


Thank you,
    But I wish there were a foolproof reload.
with best regards,
Sudheer

- Original Message -
> From: Jean-Michel Pichavant 
> To: Sudheer Joseph 
> Cc: python-list@python.org
> Sent: Tuesday, 20 August 2013 10:07 PM
> Subject: Re: refreshing the edited python function
> 
> 
> - Original Message - 
> 
>>  Hi,
>>  I have been using ipython and ipython with qtconsole and working on a
>>  code with functions. Each time I make a modification in function
> 
>>  I have to quit the IPython console (in both with and without the qt console)
>>  and reload the function freshly if I need to see the changes I made
>>  in the function. I tried below options
>>  del function name
> 
>>  import the module again by issuing "from xxx.py import yy"
>>  import xxx.py
>>  make changes
>>  reload(xxx.py)
>>  this works only if the function in the code has the same name as the
>>  code. But even this do not reflect the changes made by editing the
>>  code.
>>  So what is the standard way to update the function for further tests
>>  after an edit?
>>  with best regards,
>>  Sudheer
> 
> Hi,
> 
> My "standard" way ;) :
> 1/ create a file
> 2/ edit the code
> 3/ run ipython (with %pdb on)
> 4/ within ipython "run myfile.py"
> 5/ check / introspect /debug
> 6/ change the code
> 7/ exit ipython
> 8/ reenter ipython
> 9/ using the ipython shell history, reexecute the file (2 key presses) and go back to 5/
> 
> I used to reload my objects, it's been useful until one time when I lost a 
> lot of time because of some nasty side effect. In the end it's not worth it. 
> Always quit the shell, always.
> 
> JM
> 
> 
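
In IPython terms, steps 3 to 5 of this workflow look roughly like the following
(myfile.py is a hypothetical file name):

In [1]: %pdb on
Automatic pdb calling has been turned ON
In [2]: %run myfile.py
In [3]: some_result              # inspect / introspect whatever the script defined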
-- 
http://mail.python.org/mailman/listinfo/python-list


memory management

2013-02-18 Thread Sudheer Joseph
Hi,
I have been trying to compute the cross correlation between a time series
at one location, f(1), and the time series of spatial data, f(X,Y,T), saving the
resulting correlation coefficients and lags in a fairly large 3-dimensional array.
The code I wrote for this works for a few iterations and then hangs, apparently
due to a memory crunch. Can anybody suggest a better way to handle this situation
so that the computation and data storage can be done without hangups? Finally I
intend to save the data as a netCDF file, which is not implemented as of now.
Below is the piece of code I wrote for this purpose.

from mpl_toolkits.basemap import Basemap as bm, shiftgrid, cm
import numpy as np
import matplotlib.pyplot as plt
from netCDF4 import Dataset
from math import pow, sqrt
import sys
from scipy.stats import t
indep=120
nlags=365
ncin = Dataset('qu_ru.nc', 'r')
lons = ncin.variables['LON421_600'][:]
lats = ncin.variables['LAT81_220'][:]
dep = ncin.variables['DEPTH1_29'][:]
adep=(dep==indep).nonzero()
didx=int(adep[0])
qu = ncin.variables['qu'][:,:,:]
#qv = ncin.variables['QV'][0,:,:]
ru = ncin.variables['ru'][:,didx,0,0]
ncin.close()
fig = plt.figure()
ax = fig.add_axes([0.1,0.1,0.8,0.8])
# use major and minor sphere radii from WGS84 ellipsoid.
m = bm(projection='cyl', llcrnrlon=30, llcrnrlat=-40,urcrnrlon=120, 
urcrnrlat=30)
# transform to nx x ny regularly spaced 5km native projection grid
nx = int((m.xmax-m.xmin))+1; ny = int((m.ymax-m.ymin)+1)
q=ru[1:2190]
qmean=np.mean(q)
qstd=np.std(q)
qnorm=(q-qmean)/qstd
lags3d=np.arange(731*140*180).reshape(731,140,180)
r3d=np.arange(731*140*180).reshape(731,140,180)
for i in np.arange(len(lons)):
    for j in np.arange(len(lats)):
        print i, j
        p = qu[1:2190, j, i].squeeze()
        p.shape
        pmean = np.mean(p)
        pstd = np.std(p)
        pnorm = (p - pmean) / pstd
        n = len(p)
        # fg = plt.figure()
        c = plt.xcorr(p, q, usevlines=True, maxlags=nlags, normed=True, lw=2)
        acp = plt.acorr(p, usevlines=True, maxlags=nlags, normed=True, lw=2)
        acq = plt.acorr(q, usevlines=True, maxlags=nlags, normed=True, lw=2)
        acp[1][nlags] = 0
        acq[1][nlags] = 0
        lags = c[0]
        r = c[1]
        lags3d[:, j, i] = lags
        r3d[:, j, i] = r
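
A possible way around the hang (a sketch, not the original code): plt.xcorr and
plt.acorr draw every result into the current figure, so matplotlib keeps tens of
thousands of line objects alive as the loops progress; computing the correlation
directly with numpy and preallocating float output arrays avoids that. The dummy
sizes and random data below only stand in for the arrays read from qu_ru.nc:

import numpy as np

# Small dummy sizes; in the real case nt=2189, nlat=140, nlon=180, nlags=365
nt, nlat, nlon, nlags = 400, 8, 10, 50
qu = np.random.randn(nt, nlat, nlon).astype(np.float32)
q = np.random.randn(nt).astype(np.float32)

lags = np.arange(-nlags, nlags + 1)
# Preallocate float output; np.arange(...).reshape(...) creates integer arrays,
# which would silently truncate the correlation coefficients.
r3d = np.zeros((2 * nlags + 1, nlat, nlon), dtype=np.float32)

qn = (q - q.mean()) / q.std()
mid = nt - 1                          # index of zero lag in the 'full' correlation

for i in range(nlon):
    for j in range(nlat):
        p = qu[:, j, i]
        pn = (p - p.mean()) / p.std()
        c = np.correlate(pn, qn, mode='full') / nt    # roughly Pearson r at each lag
        r3d[:, j, i] = c[mid - nlags:mid + nlags + 1]

With the output kept as float32 and no figures created inside the loop, the memory
footprint stays close to the size of the input array.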
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: memory management

2013-02-18 Thread Sudheer Joseph
> Python version and OS please.  And is the Python 32bit or 64bit?  How
> much RAM does the computer have, and how big are the swapfiles?
>
Python 2.7.3
ubuntu 12.04 64 bit
4GB RAM
>
> "Fairly big" is fairly vague.  To some people, a list with 100k members
> is huge, but not to a modern computer.
I have data loaded into memory from a netCDF file which is 2091*140*180 grid
points (2091 times, 140 latitudes, 180 longitudes; roughly 0.4 GB as 64-bit
floats). Apart from this I define two 3-D arrays, r3d and lags3d, to store the
output for writing out to a netCDF file after completion.
>
> How have you checked whether it's running out of memory?  Have you run
> 'top' on it?  Or is that just a guess?

I have not done this, but the progress (assessed from the printed grid indices i
and j) stops after j=6, i.e. after running 6 longitude grids.
I will check top as you suggested.

Here is the result of top; it used about 3 GB of memory:

  PID USER  PR  NI  VIRT  RES   SHR S %CPU %MEM    TIME+  COMMAND
 3069 sjo   20   0  3636m 3.0g 2504 D    3 78.7  3:07.44 python
>
> I haven't used numpy, scipy, nor matplotlib, and it's been a long time
> since I did correlations.  But are you sure you're not just implementing
> an O(n**3) algorithm or something, and it's just extremely slow?
>
Correlation does not normally involve such computation; I am not sure whether
Python does something like that internally.
with best regards,
Sudheer
>
> > from mpl_toolkits.basemap import Basemap as bm, shiftgrid, cm
> > import numpy as np
> > import matplotlib.pyplot as plt
> > from netCDF4 import Dataset
> > from math import pow, sqrt
> > import sys
> > from scipy.stats import t
>
> --
> DaveA
-- 
http://mail.python.org/mailman/listinfo/python-list