
Subrecipient Risk Assessment

Robert Prentiss, Sr. Grants and Contracts Specialist,

The University of Texas at Austin

Susan Wyatt (Sedwick) Linehan, Consulting Associate, Attain LLC

Anita Mills, Solutions Consultant, Evisions

Why Subrecipient Monitoring!?!?!

Disallowed costs

Federal regulators knocking on your door

Fine$ + Penaltie$

Administrators could be questioned and face criminal penalties

Damages – money you have to pay back, plus more money you have to pay

Corrective actions – spend more money to put systems in place to prevent the problems from happening in the future

Stewardship

Accountability Timeline

2006 – FFATA signed
2007 – USAspending.gov launched
2009 – False Claims Act and Fraud Enforcement and Recovery Act (FERA); ARRA
2010 – FFATA reporting
2014 – Omni Circular (Uniform Guidance); Digital Accountability and Transparency Act of 2014

Recent Findings

Criminal charges in 2014 (fraud 2008-2009): Morgan State – paid personal expenses on an NSF grant and work never completed; facing 20 years
Criminal charges in 2013 (fraud 2006-2009): University of Maryland – P-card fraud and payments to fake companies
Criminal charges in 2011 (fraud 1999-2009): Florida State University – fraudulent contracts on an NSF grant
Criminal charges in 2006 (fraud 2001-2002): UMass and Yale – overstated costs on subcontracts

UG Sub Risk Assessment Requirements

• §200.331 requires an evaluation of each sub’s risk of noncompliance, considering:
– Prior experience
– Results of previous audits, including project-specific audits
– New personnel or substantially revised systems
– Existence of other Federal awards
– Other considerations are allowed

§200.110 Effective/Applicability

• For awards with incremental funding under the UG, subaward flow-down will need to be addressed.

• Agencies have the option of considering incremental funding actions under the UG on previously issued awards to be opportunities to change award terms and conditions with the first increment issued on or after 12/26/14.

Subrecipient Risk Assessment

The FDP has been developing a questionnaire to assess subrecipient risk.

UT-Austin has been using draft versions of the questionnaire for over a year.

We’ve got enough data to start drawing some conclusions about how it works in practice.

The FDP RAQ

The Risk Assessment Questionnaire (RAQ) consists of fifteen unscored (yes/no) questions, and thirteen scored questions.

Of the scored questions, six are related to the institution, and seven are project-specific.

The FDP RAQ

There are some ambiguities in the questions, but the questionnaire is intended to be fairly straightforward and easy to complete.

The more difficult questions are: What does the score mean? What should be done with the score?

Institutional Questions

Maximum Score

1) Location 9

2) Type of organization 6

3) Negotiated IDC rate agreement 3

4) Audit results 6

5) Maturity 6

6) Conflict of interest experience 6

Maximum total: 36

Project Questions

Maximum Score

1) Prime sponsor type 8

2) Prime award type 6

3) Subaward amount 3

4) Percent of prime award 9

5) Human or animal subjects 9

6) Scope of work and deliverables 6

7) Place of performance 6

Maximum total: 47
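To make the structure of the scored portion concrete, here is a minimal sketch in Python of the two scored sections and a tally function. The category names and maximum scores come from the tables above; the data structure and function names are our own illustration, not part of the FDP instrument.

# Minimal sketch (not the FDP RAQ itself) of the two scored sections.
# Maximum scores per category come from the tables above.
INSTITUTIONAL_MAX = {
    "location": 9,
    "type_of_organization": 6,
    "negotiated_idc_rate_agreement": 3,
    "audit_results": 6,
    "maturity": 6,
    "conflict_of_interest_experience": 6,
}  # sums to 36

PROJECT_MAX = {
    "prime_sponsor_type": 8,
    "prime_award_type": 6,
    "subaward_amount": 3,
    "percent_of_prime_award": 9,
    "human_or_animal_subjects": 9,
    "scope_of_work_and_deliverables": 6,
    "place_of_performance": 6,
}  # sums to 47

def score_subaward(institutional_answers, project_answers):
    """Tally the scored answers; each argument maps a category name to points."""
    inst = sum(institutional_answers.values())
    proj = sum(project_answers.values())
    return {"institutional": inst, "project": proj, "total": inst + proj}

The subaward examples that follow are simply this tally applied to three real answer sets.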

Subaward A

Subrecipient is a large public university in Texas.

Prime award is a research grant from a large federal agency.

Subaward A

Institutional Questions Score

1) U.S. based institution 0

2) University 0

3) Negotiated IDC rate 0

4) Satisfactory A-133 audit 0

5) 10+ years subrecipient experience 0

6) Conflict of interest experience 0

Total: 0

Subaward A

Project Questions Score

1) Routine granting agency 0

2) Grant 0

3) Outgoing funds > $650,000 3

4) 25%-49% of prime award 3

5) No human/animal subjects 0

6) Responsible for tangible products 1

7) All work at subrecipient institution 0

Total: 7

Subaward A

Institutional Score 0

Project Score 7

Total Score 7

Subaward B

Subrecipient is a non-profit research organization.

Prime award is a cooperative agreement from a large federal agency.

Subaward B

Institutional Questions Score

1) U.S. based institution 0

2) Non-profit 4

3) Negotiated IDC rate 0

4) Satisfactory A-133 audit 0

5) 10+ years subrecipient experience 0

6) Conflict of interest experience 0

Total: 4

Subaward B

Project Questions Score

1) More stringent federal sponsor 4

2) Cooperative agreement 2

3) Outgoing funds > $650,000 3

4) 0-24% of prime award 0

5) No human/animal subjects 0

6) Continuation funding tied to sub performance 6

7) Some work at prime institution and in the field 4

Total: 19

Subaward B

Institutional Score 4

Project Score 19

Total Score 23

Subaward C

Subrecipient is a small company.

Prime award is a contract from a state agency.

Subaward C

Institutional Questions Score

1) U.S. based institution 0

2) Industry 6

3) No IDC requested 0

4) No audit 6

5) 5-9 years subrecipient experience 2

6) No conflict of interest experience 6

Total: 20

Subaward C

Project Questions Score

1) State sponsor 4

2) Contract 6

3) Outgoing funds $150,000 - $649,999 2

4) Over 50% of prime award 9

5) No human/animal subjects 0

6) Responsible for tangible products 1

7) All work at subrecipient institution 0

Total: 22

Subaward C

Institutional Score 20

Project Score 22

Total Score 42

Quantifying Risk

The FDP questionnaire was designed with the assumption that risk of subrecipient non-compliance is always multifaceted.

No single factor would cause a subaward to be considered high risk.

Quantifying Risk

Like most quantifications of complex ideas, the score can be accurate, but not precise.

Accuracy vs. Precision

An example: walkscore.com scores for 2900 Guadalupe Street versus 4300 Avenue H.

The Approach

Subawards will be grouped into pools of low, moderate, and high risk.

The Approach

Moderate and high risk subawards will receive an additional, comprehensive review at the pre-award stage. Non-quantifiable risks can be assessed and considered at this point.

High risk subawards will be the focus of our post-award desk review program.

Subaward scores at UT-Austin

                      Average   Average Deviation
Institutional Score      4             4
Project Score            8             5
Total Score             12             8
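As a minimal sketch of how these summary statistics could be computed, assuming "average deviation" means the mean absolute deviation from the mean (that interpretation is ours, not stated in the slides):

def average(scores):
    return sum(scores) / len(scores)

def average_deviation(scores):
    # Mean absolute deviation from the mean (our reading of "average deviation").
    mean = average(scores)
    return sum(abs(s - mean) for s in scores) / len(scores)

# Illustrative values only; these are not UT-Austin's actual subaward totals.
totals = [7, 23, 42, 5, 11, 3, 9]
print(round(average(totals), 1), round(average_deviation(totals), 1))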

Number of subawards in each scoring range (total score)

Institutional Score Correlations

The maturity of the subrecipient correlates moderately with the type, indirect cost rate agreement status, and A-133 audit status.

Less mature subrecipients tend to:
a) be in industry
b) not have a negotiated IDC rate agreement
c) not have an A-133 audit

Project Score Correlations

Prime sponsor type correlates moderately with prime award type.

Federal sponsors tend to award grants, while industry sponsors are more likely to award contracts.

Correlations: a problem?

We have much more quantifiable information about the project than we do about the institution.

Domestic colleges and universities with clean A-133 audits almost always have an institutional score of zero.

Subaward scores at UT-Austin

                      Average   Avg. Deviation   Avg. + Avg. Dev.   Avg. + 2 × Avg. Dev.
Institutional Score      4            4                  8                   12
Project Score            8            5                 13                   18
Total Score             12            8                 20                   28


Review of subawards A, B, and C

                      Subaward A   Subaward B   Subaward C
Institutional Score        0            4           20
Project Score              7           19           22
Total Score                7           23           42


The Plan

Moderate risk thresholds will be set by the institutional and project scores.

Moderate risk begins when the score equals the average plus the average deviation.

High risk threshold will be set by the total score.

High risk begins when the score equals the average plus twice the average deviation.
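As a sketch of how this pooling rule works in practice, using the threshold values from the UT-Austin scoring table (the function and variable names are ours, not part of the plan itself):

# Thresholds from the UT-Austin table: moderate risk when the institutional or
# project score reaches average + average deviation; high risk when the total
# score reaches average + 2 x average deviation.
MODERATE_INSTITUTIONAL = 8   # 4 + 4
MODERATE_PROJECT = 13        # 8 + 5
HIGH_TOTAL = 28              # 12 + 2 * 8

def risk_pool(institutional, project, total):
    if total >= HIGH_TOTAL:
        return "high"
    if institutional >= MODERATE_INSTITUTIONAL or project >= MODERATE_PROJECT:
        return "moderate"
    return "low"

# Subawards A, B, and C from the earlier examples:
print(risk_pool(0, 7, 7))      # low
print(risk_pool(4, 19, 23))    # moderate (project score 19 exceeds 13)
print(risk_pool(20, 22, 42))   # high (total score 42 exceeds 28)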

Subaward scores at UT-Austin

                      Average   Avg. Deviation   Avg. + Avg. Dev.   Avg. + 2 × Avg. Dev.
Institutional Score      4            4                  8                   12
Project Score            8            5                 13                   18
Total Score             12            8                 20                   28

Risk Pools


Institutional Risk

How much do scores vary over time?
What will the institutional risk profile look like in 5 years? 10 years?
How much follow-up work will desk reviews require?
When will site visits be necessary?


Institutional Risk

In order to mitigate institutional risk, thresholds will be reset every six months, based on the scores of the previous twelve months.
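A rough sketch of that semiannual reset, assuming each assessment is stored with its date and three scores (the record format and helper names are our assumptions for illustration):

from datetime import date, timedelta

def reset_thresholds(records, today):
    """Recompute thresholds from assessments recorded in the previous twelve months.

    records: list of dicts like
        {"date": date, "institutional": int, "project": int, "total": int}
    Assumes at least one assessment falls in the twelve-month window.
    """
    recent = [r for r in records if r["date"] >= today - timedelta(days=365)]

    def avg(xs):
        return sum(xs) / len(xs)

    def avg_dev(xs):
        m = avg(xs)
        return sum(abs(x - m) for x in xs) / len(xs)

    inst = [r["institutional"] for r in recent]
    proj = [r["project"] for r in recent]
    tot = [r["total"] for r in recent]
    return {
        "moderate_institutional": avg(inst) + avg_dev(inst),
        "moderate_project": avg(proj) + avg_dev(proj),
        "high_total": avg(tot) + 2 * avg_dev(tot),
    }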

Institutional Risk

Current thresholds were set on January 1st:

Moderate risk: Institutional Score 8, Project Score 13
High risk: Total Score 28

If they were reset today, they would be:

Moderate risk: Institutional Score 8, Project Score 12
High risk: Total Score 27

Other Approaches

Three institutions responded to UT-Austin’s methodology.

We’ll refer to them as institutions X, Y, and Z.

Institution X

“Have you lost your mind?”

Institution X

We’ll take a much more straightforward approach.

We’ll use a shorter, simplified version of the FDP questionnaire.

We’ll set fixed thresholds based on our collective judgment and experience.

Institution Y

The total score should set the moderate risk threshold.

The institutional and project scores should set the high risk threshold.

Institution Z

There should only be two risk pools: low and high.

Total score will be the sole determinant.

Additional questions will be added to better assess the risk of subawards issued from state prime awards.

Institution Z

Statistician: “Nothing about the FDP questionnaire or UT-Austin’s methodology is scientific.”

UT-Austin: “We agree! It couldn’t possibly be at this point. But that doesn’t mean it’s not valuable.”

FDP/COGR

From the June 2014 White Paper on the Uniform Guidance: “A major issue of interest to universities is the unnecessary and unproductive duplication of audit reviews and a pass-through entity’s inability to rely on auditor and federal agency management decisions for entities already subject to the A-133 process.”

The Problem

No individual institution looks forward to a desk review that reveals significant problems with a subrecipient.

But it’s in our collective interest to avoid unnecessary desk reviews of compliant institutions.

The Solution

Therefore, our risk assessment tools should be sharp, so that they can be used sparingly.

The only way to improve them is to collect and analyze data about how our subrecipient risk assessments correspond with the results of our desk reviews.

Questions

How will your institution be conducting subrecipient risk assessments?

What are your thoughts on the FDP questionnaire and UT-Austin’s implementation of it?