Monday, November 21, 2011
Unrest at UC Davis
I have to admit that I barely got any work done today. Most of my time was spent thinking about the recent events at UC Davis (horrible on the part of the police and administrators, and amazing on the part of the faculty and students). The day began with an open meeting in the Physics Department chair's office (attended by faculty and students), followed by a campus-wide rally that involved thousands of students, faculty, and community members. I am impressed by the outstanding behavior of my fellow students and comforted by how much the faculty care for the students and are willing to stand side-by-side with them.
Thursday, November 17, 2011
Peak Error Estimate
Today I was finalizing updated figures for the DLSCL J0916.2+2951 letter, and one of the details involved adding peak error bars to the weak lensing mass maps. First I estimate the peak using an analysis that takes advantage of all the observed galaxies. Then, to estimate the variance, I use weak lensing maps made from bootstrap samples of the observed galaxies. Taking extraction regions around each subcluster, I then measure the distribution of peaks in the bootstrap sample. See the screen capture from my research notebook (side note: I highly recommend Microsoft OneNote).
[Figure: Weak Lensing Peak Variance Estimate Notes]
This gives me a distribution in each peak's RA and Dec. For example, the y-pixel coordinate of the Subaru weak lensing southern peak is shown below. The dashed lines represent the 1-sigma deviations, and the blue and red lines are just different statistics.
Finally, this information is plotted on the weak lensing mass map as blue cross-hairs.
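The bootstrap procedure above can be sketched in a few lines. This is a toy version, not the actual pipeline: the `mass_map` stand-in is just a binned galaxy-count map rather than a real weak lensing mass reconstruction, and the catalog, region, and bin choices are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def mass_map(x, y, bins=50, extent=(0, 100)):
    """Stand-in for the weak lensing mass reconstruction:
    here just a binned 2D galaxy-count map."""
    H, xe, ye = np.histogram2d(x, y, bins=bins, range=[extent, extent])
    return H, xe, ye

def peak_in_region(H, xe, ye, region):
    """Return (x, y) of the highest pixel inside an extraction
    region (xmin, xmax, ymin, ymax) around one subcluster."""
    xc = 0.5 * (xe[:-1] + xe[1:])
    yc = 0.5 * (ye[:-1] + ye[1:])
    xmin, xmax, ymin, ymax = region
    mask = np.zeros_like(H, dtype=bool)
    mask[np.ix_((xc >= xmin) & (xc <= xmax),
                (yc >= ymin) & (yc <= ymax))] = True
    Hm = np.where(mask, H, -np.inf)
    i, j = np.unravel_index(np.argmax(Hm), Hm.shape)
    return xc[i], yc[j]

# Mock catalog: uniform background plus an overdensity near (30, 70)
x = np.concatenate([rng.uniform(0, 100, 2000), rng.normal(30, 3, 400)])
y = np.concatenate([rng.uniform(0, 100, 2000), rng.normal(70, 3, 400)])

region = (15, 45, 55, 85)
peaks = []
for _ in range(200):                       # bootstrap resamples
    idx = rng.integers(0, x.size, x.size)  # resample galaxies with replacement
    H, xe, ye = mass_map(x[idx], y[idx])
    peaks.append(peak_in_region(H, xe, ye, region))
peaks = np.array(peaks)

# 1-sigma interval from the 16th/84th percentiles of the bootstrap peaks
lo, hi = np.percentile(peaks[:, 0], [15.87, 84.13])
print(f"x-peak 68% interval: [{lo:.1f}, {hi:.1f}]")
```

The percentile-based interval is one of the "different statistics" one could draw on such a distribution; a Gaussian fit or a simple standard deviation would be alternatives.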
Labels:
DLS,
DLSCL J0916.2+2951,
papers,
statistics,
Subaru,
weak lensing
Wednesday, November 16, 2011
More Referee Comments Addressed
Spent most of the day addressing the referee's comments and updating the DLSCL J0916.2+2951 paper. I am about 75% of the way through them and should have them all addressed by the end of the week.
There was also an interesting talk by John Felde at the UC Davis Physics Grad Colloquium about measuring the theta_13 mixing angle at the Double Chooz experiment. I am particularly interested in the possibility of detecting additional mixing angles, due to more than the three expected neutrinos, when their near detector comes online to complement their far detector. This is possible by comparing the near and far signals and studying their ratio. This is still a few years off, and might even be a few experiments off, but there seems to be potential.
Tuesday, November 15, 2011
Improved Peak Measurements
Part of the day was spent helping Dave put the finishing touches on his NSF proposal. I think it turned out pretty well. The main focus of the proposal was weak lensing shear peaks, clusters, and filaments. Here's hoping that it gets accepted.
A large part of the day was improving my measurement of the shear peak centers' errors for DLSCL J0916.2+2951.
Another interesting note is that Sophie Maurogordato, whom I mentioned in arXiv to the Rescue, is interested in applying the method that I developed to constrain the dynamics of cluster mergers to her merging cluster A2163. It is a great feeling to have something you developed be useful to someone else. This is actually the second time I will have applied my method to another cluster (the first was to a high-redshift cluster pair that Brian Lemaux has been working on).
Monday, November 14, 2011
When Noise Looks Like Signal
I was pretty excited when I first measured the number distribution of galaxies for my filament candidate stack. In the figure titled Signal to the right, white/red represents more galaxies, grading down to purple/black for fewer galaxies. Right where it is expected there were more galaxies (along the filaments). Now compare this with the figure below titled Null. These two figures look very similar. This is a problem because in the bottom figure I have stacked cluster pairs that are so far separated in redshift space that they should definitely not have a filament between them.
First things first: understand why, even in the null case, there is what appears to be a filament signal. I have two primary suspects:
- Some of the postage stamp regions extend beyond the survey area. Thus, when adding all the postage stamps together, there will be fewer galaxies at larger radii from the filament axis simply because not all postage stamps had observed galaxies at these larger radii.
- This is just the expected signal from two overlapping clusters.
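The first suspect has a natural correction: normalize each stacked pixel by the number of postage stamps that actually observed it, rather than by the total number of stamps. Here is a minimal sketch of that idea; the stamp arrays and coverage masks are toy data, not the real DLS stacks.

```python
import numpy as np

def stack_with_coverage(stamps, coverage):
    """Stack galaxy-count postage stamps, normalizing each pixel by
    the number of stamps that actually observed it. `stamps` and
    `coverage` are lists of equal-shaped 2D arrays; coverage is 1
    where the stamp falls inside the survey footprint, 0 outside."""
    counts = np.sum(stamps, axis=0)
    cov = np.sum(coverage, axis=0).astype(float)
    # Pixels no stamp observed get NaN instead of a misleading zero
    return np.where(cov > 0, counts / np.where(cov > 0, cov, 1), np.nan)

# Toy demo: two stamps, the second cut off in its right half
s1 = np.ones((4, 4))
s2 = np.ones((4, 4)); s2[:, 2:] = 0   # no galaxies observed there
c1 = np.ones((4, 4))
c2 = np.ones((4, 4)); c2[:, 2:] = 0   # footprint mask says: no data
mean = stack_with_coverage([s1, s2], [c1, c2])
print(mean)
```

In the raw sum the right half would show half the counts of the left half, mimicking a density drop away from the filament axis; the coverage-weighted mean stays flat, which is the behavior the null test should show.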
Second things second: correct for this systematic noise. Theoretically I know how to correct either of the above effects; in practice it may be another story.
Friday, November 11, 2011
Line of Sight
I have been working on addressing the referee's comments for our DLSCL J0916.2+2951 paper. One item that was crossed off the to do list was making a redshift histogram to show if there are any line-of-sight structures that may confuse our results.
The black histogram is our sample of spectroscopic redshifts (0.1<z<1.0) in the area of our cluster. The over-plotted red histogram is the subsample of galaxies that satisfy the photo-z cut 0.43<z<0.63, which we use to select likely cluster members. We want the red histogram to completely cover the black histogram at the cluster redshift of z=0.53 but not cover the black histogram at any other redshifts. Unfortunately this is not the case, otherwise photometric redshifts would be just as good as spectroscopic redshifts.
The good news is that there is not much structure along the line of sight. There is a small peak at z ~ 0.6, and worse, we include a lot of these galaxies in our photo-z cut. While it is a much smaller concentration than our cluster galaxies, it is worth investigating. Looking at the projected distribution of these galaxies, they are evenly distributed across the area of the cluster with no apparent structure. Thus it should not affect our results in any significant way. It is curious though; I wonder if it is a wall?
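The trade-off the histogram illustrates, photo-z completeness versus line-of-sight contamination, can be quantified on a mock catalog. Everything below is invented for illustration (the photo-z scatter of 0.06, the member window, the wall at z ~ 0.6); only the cut 0.43 < z < 0.63 and the cluster redshift z = 0.53 come from the post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock spectroscopic sample: field galaxies 0.1 < z < 1.0, a cluster
# at z = 0.53, and a smaller wall-like structure at z ~ 0.6
z_field = rng.uniform(0.1, 1.0, 1500)
z_clust = rng.normal(0.53, 0.01, 300)
z_wall  = rng.normal(0.60, 0.01, 60)
z_spec  = np.concatenate([z_field, z_clust, z_wall])

# Photo-z's scatter about the true redshift (sigma_z = 0.06 assumed)
z_phot = z_spec + rng.normal(0, 0.06, z_spec.size)

# The member-selection cut from the paper, applied to the photo-z's
sel = (z_phot > 0.43) & (z_phot < 0.63)

# Completeness: fraction of true cluster members the cut keeps.
# Contamination: fraction of selected galaxies that are not members.
members = np.abs(z_spec - 0.53) < 0.03
completeness = np.mean(sel[members])
contamination = 1 - np.mean(members[sel])
print(f"completeness = {completeness:.2f}, contamination = {contamination:.2f}")
```

Even a cut that keeps most true members pulls in a comparable number of interlopers at these photo-z scatters, which is exactly why the red histogram cannot perfectly cover the black one.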
Labels:
DLS,
DLSCL J0916.2+2951,
papers,
photo-z's,
spec-z's
Thursday, November 10, 2011
arXiv to the Rescue
[Figure: DLSCL J0916.2+2951]
Now that it has become apparent that I will not be able to complete my filament analysis (at least to any degree that I have confidence in the results) in time for the NSF deadline my focus has shifted to getting our letter on the merging galaxy cluster DLSCL J0916.2+2951 published.
First point of order: it turns out we left something out of our paper. Sophie Maurogordato read our paper on the arXiv and pointed out that we left A2163 off our list of dissociative mergers. After reading her papers today it is clear that this is a dissociative merger and should be included in our paper.
The arXiv came through on one of its promises. My coauthors and I (and even our referee) missed this cluster, and had we not put our paper on the arXiv before the article was published, this would have been an erratum rather than a simple change to our current draft.
Wednesday, November 9, 2011
Disappointment
Debugging a large program under a deadline is not very fun. As most people who program know, the debugging process is like a random walk: once you fix one thing, you find out that something else is broken. This process is compounded when you are writing a program to measure a never-before-measured signal. Then you don't know whether you are not seeing the signal because it is really not there or because of an error in your code. To combat this I created a simple mock filament/cluster dataset, to test the code on a problem I know the answer to. Result:
This is the correct answer. The filament signal is the red stripe down the middle. Now compare this with the result from running my program on real data:
One of these things is not like the other! Clearly something has gone wrong in the analysis of the real data, and clearly the problem is rooted in some aspect of the real data that I did not model in my simple analysis. The two primary suspects: masked regions in the survey, and the full p(z).
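A mock of this kind is cheap to build. Here is a minimal sketch of the idea, a uniform background plus a straight filament of galaxies connecting two cluster positions, with every number (galaxy counts, filament width, field size) invented for the demo rather than taken from the real analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

def mock_pair_with_filament(n_bg=2000, n_fil=300, size=100.0):
    """Toy mock: uniform background plus a straight filament of
    galaxies connecting two cluster positions at mid-height."""
    x_bg = rng.uniform(0, size, n_bg)
    y_bg = rng.uniform(0, size, n_bg)
    # Filament along y = size/2 between x = 0.25*size and 0.75*size,
    # with a small Gaussian width perpendicular to its axis
    x_f = rng.uniform(0.25 * size, 0.75 * size, n_fil)
    y_f = rng.normal(0.5 * size, 2.0, n_fil)
    return np.concatenate([x_bg, x_f]), np.concatenate([y_bg, y_f])

x, y = mock_pair_with_filament()
H, xe, ye = np.histogram2d(x, y, bins=20, range=[[0, 100], [0, 100]])

# The stripe of excess counts should sit in the central y-bins
central = H[5:15, 9:11].mean()   # pixels along the filament axis
outer   = H[5:15, :3].mean()     # pixels well away from the axis
print(f"central = {central:.1f}, outer = {outer:.1f}")
```

Running the analysis code on such a mock should recover the red stripe down the middle; when it does on the mock but not on the data, the bug (or missing systematic) lives in whatever the mock leaves out, here the survey masks and the full p(z).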
I am afraid that I will not have results for the NSF deadline. I am generally an over optimistic person. I have come to terms with this, but it is still disappointing when I don't meet my expectations.
Tuesday, November 8, 2011
Already Slipping
Lesson learned: you should never start anything on a Friday. By the time Monday rolled around I got carried away with debugging my Filament Analysis program and forgot to blog my work for the day. So, at the risk of setting a bad precedent, let me recap what happened yesterday as well as what went on today.
Monday:
I am trying to get some filament results for Dave's NSF proposal since the committee liked that aspect of the proposal last year. I have rewritten my filament analysis code and just finished the bulk of the coding on Friday. I spent all day debugging (as well as attempting to debug) my new filament analysis program. The gist of this program is that I give it the positions (RA, Dec, z) of galaxy clusters and it estimates likely locations of filaments, specifically between close cluster pairs (similar to Mead et al. 2010). I have to go about it this way because filaments are not nearly as over-dense as clusters and are very hard to detect, whereas clusters are relatively easy to detect. I then combine the signal from each of the filaments by rotating and stacking them. I am sure I will go into more details of this process later. Surprisingly, one of the biggest difficulties of the day was obtaining and manipulating the galaxy catalog for the field I am analyzing (a 2x2 sq. deg. area, one of the DLS fields). I am using each galaxy's full photometric redshift probability distribution, p(z) for short, so instead of a single number representing each galaxy's redshift we have 500, which results in a ~3 GB catalog. Most text editors choke on anything larger than 2 GB, so even mundane manipulations of this catalog (e.g. search and replace) are a challenge. vim came to the rescue though, as apparently it has no file size limitation.
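The rotate-and-stack step can be illustrated with a small geometric helper. This is a toy sketch of the coordinate transform only, in flat 2D pixel coordinates with made-up positions; the real code works on (RA, Dec, z) and handles much more.

```python
import numpy as np

def rotate_to_pair_frame(gal_xy, c1, c2):
    """Rotate (and recenter) galaxy positions so the cluster pair
    lies on the x-axis with the pair midpoint at the origin.
    Stacking many pairs in this common frame coadds any filament
    signal along the axis between the clusters."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    mid = 0.5 * (c1 + c2)
    dx, dy = c2 - c1
    theta = np.arctan2(dy, dx)            # pair-axis angle
    c, s = np.cos(-theta), np.sin(-theta)  # rotate by -theta
    R = np.array([[c, -s], [s, c]])
    return (np.asarray(gal_xy, float) - mid) @ R.T

# One toy pair: a galaxy at the second cluster and one at the first
gals = np.array([[3.0, 4.0], [0.0, 0.0]])
out = rotate_to_pair_frame(gals, c1=(0, 0), c2=(3, 4))
print(out)  # galaxies land at (+2.5, 0) and (-2.5, 0) on the pair axis
```

After this transform every pair's filament candidate lies along the same axis, so the stamps can simply be summed (and, if desired, rescaled by the pair separation first).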
Today:
More time spent debugging the code. At face value the program is running correctly except for the handling of the masked-out regions (i.e. patches where we have no data), which in some instances are causing infinities in the error estimate calculations. I know how to fix this, so hopefully the code will be up and cranking out results in time for the NSF proposal deadline (effectively the end of this week).
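The infinities are the classic symptom of dividing by a zero count in a pixel with no data. A minimal sketch of the guard, using a simple Poisson fractional error as a stand-in for the actual error estimate in the code:

```python
import numpy as np

def fractional_error(counts, mask):
    """Poisson fractional error (1/sqrt(N)) per pixel, with
    masked-out pixels (mask == 0, i.e. no data) set to NaN
    instead of producing inf from a division by zero."""
    counts = np.asarray(counts, float)
    err = np.full_like(counts, np.nan)
    good = (np.asarray(mask) > 0) & (counts > 0)
    err[good] = 1.0 / np.sqrt(counts[good])
    return err

counts = np.array([[16.0, 4.0],
                   [ 0.0, 9.0]])
mask = np.array([[1, 1],
                 [0, 1]])   # bottom-left pixel is outside the survey
err = fractional_error(counts, mask)
print(err)  # [[0.25, 0.5], [nan, 0.333...]]
```

Using NaN rather than inf means downstream statistics can skip the masked pixels with `np.nanmean` and friends instead of blowing up.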
On a separate note, I got the referee's comments back on our submitted merging cluster paper. There were no major comments (e.g. "this work is crap, you should quit science and become a priest") and only about seven comments asking for a little more detail, which should only require some minor reanalysis. Hopefully we can get them addressed next week after the NSF deadline.
I mentioned this blog at our Cosmology Lunch today and it made for a pretty good discussion as well as some pretty good jokes. On the subject of openness of research and the potential for being scooped, the main conclusion was that it is hard enough to get people interested in your work at all, so there is probably little concern about people scooping you based on the rough work posted to a blog.
Well, this is obviously too long of a post and I can't possibly keep this up, so I will have to make future posts more concise. It will also help if I don't double up on days like this post.
Friday, November 4, 2011
So I have been toying with the idea of blogging for a while now but always thought that it would just be too much work. I used to keep a journal, and I would spend a number of hours each week writing in it (very detailed accounts of almost everything). I simply don't have the time for that type of journal/blog.
Then today I stumbled across David Hogg's research blog, and it has inspired me to keep a similar research blog. For one, I am excited about my research and want to share this excitement. Plus, I see some potential benefits in doing so: it will make me succinctly think about what I did each day, and it will keep an easily searchable record (I have often remembered a particular discussion or some work but couldn't recall the details). Finally, blogging about my research was suggested by my cousin-brother (Frank Rim) about a year ago, and his advice is usually pretty good (except that one time he advised that I jump into icy rapids).
So I guess this is the first post.