Welcome to iraf.net Tuesday, May 21 2024 @ 07:10 PM GMT


 Register a very large datacube?
   
bs_nuig
 05/29/2007 04:05PM (Read 4372 times)  
Newbie

Status: offline


Registered: 05/29/2007
Posts: 2
Hi,

I am trying to register a large data cube: number of images (512x512) = 1000 or upwards. I have tested "xregister", but as the help pages say, it will only accept 1-D or 2-D images, and similarly with "imalign". I could slice the data cube up using the "imslice" task, but these data cubes are ~1 GB in size and I would rather not split them up if I can help it; that would slow my pipeline and create a large number of files. Is there any task, or combination of tasks, that could be used to perform registration of this kind? I am trying to do x-y shifts on the images, no rotation.

Any help would be great. Thanks in advance,
Brendan

 
fitz
 05/29/2007 04:05PM  
Admin

Status: offline


Registered: 09/30/2005
Posts: 4040
Hi Brendan,

There's no task I know of that will specifically process a data cube, but you can still do it using a loop over image sections that address each band of the cube as a separate 2-D image. The big questions are whether you already know the shift you want to apply to each plane, or whether you want to register each plane against a particular reference image (or plane, or another cube).

Assuming, for example, you want to register all the planes in the cube against the first one, try something like[code:1:dd849d55e4]
int	nplanes, i
string	plane

reset clobber = yes
hselect ("cube.fits", "i_naxis3", yes) | scan (nplanes)
imcopy ("cube.fits[*,*,1]", "ref.fits")		# copy reference plane

for (i = 2; i <= nplanes; i = i + 1) {
    plane = "cube.fits[*,*," // i // "]"
    xregister (plane, "ref", "[*,*]", "shifts.db")	# compute the shift
    fields ("shifts.db", "2,3", > "shifts")		# extract the shift
    imshift (plane, plane, shifts_file="shifts")	# apply it
}
[/code:1:dd849d55e4]Here you use HSELECT to get the number of planes and IMCOPY to pull out the first plane as a reference image. Within the loop, you define a 'plane' variable using an image section to specify the plane being processed, then call XREGISTER to compute the shift, FIELDS to extract it, and IMSHIFT to apply the change and write the result back to the cube.

You'll likely need to tweak this a bit; I haven't tried it as working code, but it demonstrates all the parts I think you'll need. Note the 'clobber' setting so you can overwrite the fixed file names; in practice you should clean those files up inside the loop. Also, at some point it may be faster to IMSLICE the cube, process the individual images, and then IMSTACK them back into a cube rather than beating on the FITS file in this manner; you'll need to figure out where that tipping point is. Hope this helps.

Cheers,
-Mike
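For readers following along in PyRAF/Python rather than the CL, the loop above can be sketched in plain numpy. This is only an integer-pixel illustration under stated assumptions: the `estimate_shift` and `register_cube` names are invented for this sketch, and xregister itself does a proper cross-correlation fit with subpixel interpolation rather than the whole-pixel `np.roll` used here.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) shift that maps img onto ref,
    using FFT cross-correlation (the same idea xregister is built on)."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # fold wrapped peak positions into negative shifts
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

def register_cube(cube):
    """Register every plane of a (nplanes, ny, nx) array against plane 0,
    applying whole-pixel shifts in place, like the CL loop above."""
    ref = cube[0]
    for i in range(1, cube.shape[0]):
        dy, dx = estimate_shift(ref, cube[i])
        cube[i] = np.roll(cube[i], (dy, dx), axis=(0, 1))
    return cube
```

Note this holds only one plane (plus the reference) in memory at a time if the cube is read plane by plane, which matches the spirit of looping over image sections.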

 
jturner
 05/29/2007 04:05PM  
Active Member

Status: offline


Registered: 12/29/2005
Posts: 165
Hi Brendan,

If you use PyRAF, I have a script that co-adds datacubes with offsets, if that's what you want. It reads the whole cubes into memory, however, so if they are 1 GB each you may need a lot of memory compared with what Mike is suggesting.

Cheers,
James.
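The in-memory approach James describes might look something like the following toy sketch; `coadd_with_offsets` is a hypothetical name (not his actual script), and it assumes the whole-pixel offsets for each plane are already known.

```python
import numpy as np

def coadd_with_offsets(cube, offsets):
    """Co-add the planes of an in-memory sequence of 2-D arrays after
    applying a known integer (dy, dx) offset to each plane.
    The whole cube lives in memory, so a 1 GB cube needs at least
    that much RAM, as noted above."""
    acc = np.zeros_like(cube[0], dtype=float)
    for plane, (dy, dx) in zip(cube, offsets):
        acc += np.roll(plane, (dy, dx), axis=(0, 1))
    return acc / len(cube)
```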

 
bs_nuig
 05/29/2007 04:05PM  
Newbie

Status: offline


Registered: 05/29/2007
Posts: 2
Hi,

Thanks for the help. I am using PyRAF to write the pipeline, and I found that the code suggested by Mike worked practically straight off. The only snag was that imshift will not write back into the same data cube image plane; it just replaces the entire data cube with the registered image. So I ended up splitting my data cube after all. But all is not lost in terms of speed: I used xregister to do two things at once, splitting the data cube and at the same time registering against the first image in the cube, which saves first splitting the cube with imslice.

Also, I found that when co-adding my registered images, imcombine works [i:a7574eb373]very much quicker[/i:a7574eb373] if the images are not in data-cube format!

James, as some of the data is larger than 1 GB, and given my machine's memory, I will stick with splitting the cubes up for the moment. Thanks for the offer.

Cheers
Brendan
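The imcombine step Brendan mentions amounts to combining a stack of registered 2-D images pixel by pixel; a minimal numpy stand-in (the `median_combine` name is invented here, and this shows only the default-style median combine, not imcombine's rejection options) would be:

```python
import numpy as np

def median_combine(images):
    """Median-combine a list of already-registered 2-D arrays.
    The median rejects outliers (e.g. cosmic rays) that appear
    in only one frame of the stack."""
    return np.median(np.stack(images), axis=0)
```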

 
   