I  General Plan for Cycle 5 Proposals
-------------------------------------

Important changes
-----------------
RPS now has preferences
RPS now allows logarithmic monitoring intervals
RPS will check target coordinates upon submission
Proposals will be sent to reviewers as hardcopy and on CD
Large Projects will be evaluated by 2 panels

GTO proposals
-------------
  Will list Observer as PI.
  Will have a cover sheet classifying them as GO for peer review.
  Panels will first rank all proposals, then GTO proposals 
    will be identified.

GO Proposals
------------
Number of proposals expected:

Type	cyc 4	cyc 5 estimate
------	------	--------------
GO/TOO	703	700
GTO	 10	 10
LP	 47	 40
VLP	 -	 20
Arch	 39	 40
Theory	 28	 30
------	------	--------------
total	827	840


Number of Reviewers and Workload
--------------------------------
If we require that each LP and VLP be evaluated by 2 panels,
the number of proposals to be read is 840 + 60 = 900.

There will be 12,300 ks available for GO/TOO proposals.

Plan on 12-15 panels, 8 reviewers/panel.

12 panels	75 proposals/panel including 10 LP/VLP,
		1025 ks time allotment, 96 reviewers.

15 panels	60 proposals/panel including 8 LP/VLP
		820 ks time allotment, 120 reviewers.
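The two panel options above can be checked with a short arithmetic sketch (all numbers are taken from this memo):

```python
# Sketch of the Cycle 5 panel-workload arithmetic (numbers from this memo).
TOTAL_PROPOSALS = 840                 # expected Cycle 5 submissions
LP_VLP = 40 + 20                      # Large + Very Large Projects
READINGS = TOTAL_PROPOSALS + LP_VLP   # each LP/VLP read by a 2nd panel -> 900
GO_TIME_KS = 12300                    # ks available for GO/TOO proposals
REVIEWERS_PER_PANEL = 8

for panels in (12, 15):
    per_panel = READINGS // panels        # proposals each panel reads
    lp_per_panel = 2 * LP_VLP // panels   # LP/VLP readings per panel
    ks_per_panel = GO_TIME_KS // panels   # time allotment per panel
    reviewers = REVIEWERS_PER_PANEL * panels
    print(panels, per_panel, lp_per_panel, ks_per_panel, reviewers)
```

Running this reproduces both rows above: 12 panels give 75 proposals, 10 LP/VLP readings, 1025 ks, and 96 reviewers; 15 panels give 60, 8, 820 ks, and 120.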

		

CXC Technical Review
--------------------
	TOO proposals by MP
	Other constrained proposals by MP
	Proposals for bright sources by SOT
	LETG/HETG proposals by SOT/Cal

Pre-review Grades
-----------------

Make sure proposals reach panelists 4 weeks before review.

Require preliminary grades 5 days before review.
CXC will call reviewers who are late with grades.
CXC will make a rank-ordered list for each panel which delineates
(e.g.) the top 3-4 and bottom 30-40.
CXC will recommend that the panels not spend much time on these.

(Write reports for bottom half at end of first day.  Use primary
and secondary comments.  CXC already formats.  Invite one panelist
to come a day early and write reports for lowest ranked before
review?)


II  Limit Chandra projects for successful proposers?
----------------------------------------------------

For discussion, not for cycle 5
-------------------------------
Its only purpose is to increase the number of successful PIs in each review.


In cycle 4, there were 201 GO observing proposals accepted
  (239 - 9 theory - 20 archive - 9 GTO = 201)

The number of GO PIs was 169
  (197 - 8 theory - 14 archive - 6 GTO = 169)

Observing time per proposal was 16000/201 ~ 80 ks
Observing time per PI was 16000/169 ~ 95 ks

So, for each 1000 ks project, there are 10 fewer PIs participating.
For each 300  ks project, there are 3 fewer PIs participating.

Limit winners of LP and VLP to one observing project?
If this had been done in Cycle 4, ~10 more small proposals
would have been possible.
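The per-proposal and per-PI arithmetic above can be sketched directly from the memo's numbers:

```python
# Cycle 4 arithmetic from this memo: time per accepted proposal vs per PI.
TOTAL_KS = 16000
ACCEPTED = 239 - 9 - 20 - 9   # GO observing proposals accepted -> 201
PIS = 197 - 8 - 14 - 6        # distinct GO PIs -> 169

ks_per_proposal = TOTAL_KS / ACCEPTED   # ~80 ks
ks_per_pi = TOTAL_KS / PIS              # ~95 ks

# A 1000 ks project therefore occupies roughly 1000 / 95 ~ 10 PI "slots".
print(round(ks_per_proposal), round(ks_per_pi), int(1000 / ks_per_pi))
```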


III  A History of Fair-share Costs
----------------------------------

C = fair-share cost
T = total observing time
N = number of targets
J = difficulty factor:
        = 0.9 if analysis rated `easy' by the peer review
        = 1.0 if analysis rated `average' by the peer review
        = 1.2 if analysis rated `difficult' by the peer review

A cap of 250 is applied to any fair-share calculated to be over 250.
Proposers and panels are told that costs above fair-share are OK if
justified.  Fair-shares are normalized so the sum adds to 0.96 of the
funds available; the remaining 4% is reserved for over-fair-share
approvals.


Cycle 1: There was no fair-share; funds available equaled money requested
--------
The fair-share formula was generated in recognition that everyone needed
effort to set up and learn the software, and that longer observations
were generally more time-consuming to analyze than short ones.

Cycle 2:  C = (30 + 0.45T)J
--------
Cost-review panels were asked for suggestions for calculating
fair-share costs.  Several suggestions were received, and a formula was
adopted in which cost is proportional to the square root of time and
the fourth root of the number of targets.

Cycle 3:  C = 5.7 (N^1/4)(T^1/2) J
--------
Two users complained that the funds available for simple projects were
inadequate, so the constant term was reinstated.

Cycle 4:  C = 10 + 4.6 (N^1/4)(T^1/2) J
--------
No complaints yet, but the results were just released.
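The three formulas above translate directly into code (T in ks, N the number of targets, J the difficulty factor, costs in k$) — a small sketch for spot-checking the sample costs:

```python
# Fair-share formulas for Cycles 2-4 (T = time in ks, N = targets,
# J = difficulty factor; costs in k$).
def cost_cycle2(T, N, J=1.0):
    return (30 + 0.45 * T) * J              # linear in T, independent of N

def cost_cycle3(T, N, J=1.0):
    return 5.7 * N**0.25 * T**0.5 * J       # sqrt(time) x 4th root of targets

def cost_cycle4(T, N, J=1.0):
    return 10 + 4.6 * N**0.25 * T**0.5 * J  # constant term reinstated

# Spot-check against the 100 ks, single-target row of the sample table:
for cost in (cost_cycle2, cost_cycle3, cost_cycle4):
    print(round(cost(100, 1), 1))
```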


Sample Fair-share Costs (in k$, average difficulty)
---------------------------------------------------

time(ks) targets cyc 2   cyc 3   cyc 4
-------- ------- ------- ------- -------
10       1       34.5    18      24.5

100      1       75      57      56
100      3       75      75      70.5
100      10      75      101.3   91.8

300      1       165     98.7    89.7
300      3       165     129.9   114.9
300      10      165     175.5   151.7