Cycle 26 Peer Review Update
Rodolfo Montez Jr.
The Chandra Peer Review has changed significantly over twenty-five years in response to advances in technology and evolving best practices, but it has always maintained at its core a reliance on the efforts of the community and a focus on maximizing the scientific value of the observing plan. What started as a process that relied on mailed-in paper proposals and reports saved on floppy disks now depends on a host of modern software to manage a fully remote, dual-anonymous review.
Once again, 99 reviewers volunteered their time and expertise this summer to discuss the 406 proposals submitted in response to Chandra's 26th call for proposals. Proposals were discussed in eleven topical panels, two Target of Opportunity (TOO) panels (see last issue's article for more details), and a Big Project Panel. This year, however, the events took place from late May through the end of June rather than within the nominal two-week period.
Embracing Asynchronous Panels
In 2020, the onset of the COVID-19 pandemic forced us to abandon our usual in-person review format; instead, we embraced online tools (Zoom, Slack, and Google Drive) to enable a fully remote review. For a variety of reasons, especially cost, the Chandra Peer Review has remained fully remote even after other in-person events have resumed. There are certainly drawbacks to not meeting face-to-face, but the format gives participants more flexibility: they can take part in the review without the complexity of multi-day travel.
This year, we extended that flexibility even further. One of the major impediments to recruiting scientists to serve on panels is that June, when we traditionally hold our peer review, is also an active season for conferences and vacations. A single high-impact topical meeting can deprive the review of dozens of qualified reviewers, and if the conference topic overlaps with the focus of one of our panels, recruitment for that panel becomes especially difficult. Prior to 2020, holding the Review at a single venue required all panels to meet at the same time. With online panels, that restriction no longer applies, so this year we conducted our first review with asynchronous panels.
Panels 3 and 4 met the week of May 27, Panels 1 and 2 the week of June 3, and Panels 5–13 the week of June 17, followed by the Big Project Panel. In addition to accommodating panelists' schedules, this format meant that CXC scientists and staff were better able to provide support as needed. With a maximum of nine panels running concurrently at any time, we could concentrate staffing and support on the active panels. These factors may seem minor, but for the relatively small CXC support staff, they improved the efficiency of panel operations while reducing overall staffing demands.
The experiment went well, and, with many of the first-run issues worked out, we plan to continue and refine asynchronous panels in future cycles.
Cycle 26 Proposal Statistics
Full proposal statistics from Cycle 26 will be posted on the CXC website along with the Peer Review results. Both will be released at a later date, once there is further clarity about the financial and operational situation of the mission.