Table of contents
  1. Story
    1. An NSF for Data Scientists, Data Infrastructure, and Data Publications
    2. NSF Directorate for Geosciences Data Policies
    3. NSF Publications
      1. Web Site
      2. Spreadsheet Matrix of Publications
      3. Spreadsheet Table Format
    4. NSF Data Tables in a Data Browser
    5. Ontology and Ontologizing
      1. Why you need an Ontology
      2. What is an Ontology made of?
      3. An Ontology links Formal meaning in computers with Human understanding
      4. An Ontology helps us assemble data from diverse sources into a meaningful whole
      5. Creating a Useful Ontology and Also How to Fail
        1. Scope: bigger isn’t necessarily better
        2. Ontologizing data is not standardizing health care
        3. An Ontology is only as valuable as its applications
      6. In Summary
  2. Slides
    1. Slide 1 NSF Funding & Research Community
    2. Slide 2 NSF Strategic Plan Description
    3. Slide 3 NSF Strategic Plan Available Formats
    4. Slide 4 NSF Strategic Plan PDF
    5. Slide 5 NSF Publications Listing
    6. Slide 6 NSF Strategic Plan Spreadsheet Knowledge Base
    7. Slide 7 NSF Strategic Plan Wiki Knowledge Base
    8. Slide 8 NSF Strategic Plan Spotfire Cover Page
    9. Slide 9 NSF Publications
    10. Slide 10 NSF Strategic Plan Tables
    11. Slide 11 NSF GEO Tables
    12. Slide 12 NSF Performance Plan Tables
  3. Spotfire Dashboard
  4. Research Notes
  5. National Science Foundation Strategic Plan for 2014 – 2018
    1. Summary
    2. Mission
    3. NSF Vision
    4. NSF Core Values
    5. Strategic Planning In a Dynamic, Global Context
    6. Stakeholder Engagement
    7. Strategic Goals
      1. Strategic Goal 1: Transform the Frontiers of Science and Engineering
      2. Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
      3. Strategic Goal 3: Excel as a Federal Science Agency
    8. Strategic Objectives
      1. Strategic Goal 1 (G1): Transform the Frontiers of Science and Engineering
        1. Strategic Objective 1 (G1/O1)
        2. Strategic Objective 2 (G1/O2)
        3. Strategic Objective 3 (G1/O3)
      2. Strategic Goal 2 (G2) Stimulate Innovation and Address Societal Needs through Research and Education
        1. Strategic Objective 1 (G2/O1)
        2. Strategic Objective 2 (G2/O2)
      3. Strategic Goal 3 (G3): Excel as a Federal Science Agency
        1. Strategic Objective 1 (G3/O1)
        2. Strategic Objective 2 (G3/O2)
    9. Agency Priority Goals (APG) and Long-term Performance Goals
      1. Priority Goal 1: Increase public access to NSF-funded peer-reviewed publications
      2. Priority Goal 2: Improve the nation’s capacity in data science by investing in the development of human capital and infrastructure
    10. Appendix I - Strategies and Means
      1. Introduction
        1. Core Strategies
        2. Means for Carrying Out Core Strategies
      2. Measuring Performance
        1. Core Approaches for Measuring Performance
        2. New Approaches to Evaluation and Assessment
    11. Appendix II - Contributing Programs for Strategic Objectives
      1. List of NSF Strategic Goals and Objectives 2014-2018
      2. NSF Priority Goals
    12. List of Programs
      1. Biological Sciences (BIO)
      2. Computer and Information Science and Engineering (CISE)
      3. Engineering (ENG)
      4. Geosciences (GEO)
        1. DIRECTORATE FOR GEOSCIENCES (GEO)
          1. Table GEO Funding (Dollars in Millions)
          2. About GEO
          3. Figure GEO Subactivity Funding (Dollars in Millions)
          4. FY 2014 Summary by Division
          5. Major Investments
          6. Table GEO Major Investments (Dollars in Millions)
          7. GEO Funding for Centers Programs and Facilities
          8. Table GEO Funding for Centers Programs (Dollars in Millions)
          9. Table GEO Funding for Facilities (Dollars in Millions)
          10. Summary and Funding Profile
          11. Table GEO Funding Profile
          12. Program Monitoring and Evaluation
          13. Table Number of People Involved in GEO Activities
        2. DIVISION OF ATMOSPHERIC AND GEOSPACE SCIENCES (AGS)
          1. Table AGS Funding (Dollars in Millions)
          2. FY 2014 Summary
        3. DIVISION OF EARTH SCIENCES (EAR)
          1. Table EAR Funding (Dollars in Millions)
          2. FY 2014 Summary
        4. DIVISION OF INTEGRATIVE AND COLLABORATIVE EDUCATION AND RESEARCH (ICER)
          1. Table ICER Funding (Dollars in Millions)
          2. FY 2014 Summary
        5. DIVISION OF OCEAN SCIENCES (OCE)
          1. Table OCE Funding (Dollars in Millions)
          2. FY 2014 Summary
        6. DIVISION OF POLAR PROGRAMS (PLR)
          1. Table PLR Funding (Dollars in Millions)
          2. FY 2014 Summary
        7. References
          1. GEO 1
          2. GEO 2
          3. GEO 3
          4. GEO 4
          5. GEO 5
          6. GEO 6
          7. GEO 7
          8. GEO 8
          9. GEO 9
          10. GEO 10
          11. GEO 11
          12. GEO 12
          13. GEO 13
          14. GEO 14
      5. Mathematical and Physical Sciences (MPS)
      6. Social, Behavioral and Economic Sciences (SBE)
      7. International and Integrative Activities (IIA)
      8. United States Arctic Research Commission (USARC)
      9. Education and Human Resources (EHR)
      10. Major Research Equipment and Facilities Construction (MREFC)
      11. Agency Operations and Award Management (AOAM)
      12. Office of the National Science Board (NSB)
      13. Office of Inspector General (OIG)
  6. FY 2013 NSF Budget Request to Congress Performance
    1. NSF Performance Context
      1. Mission Statement
      2. Strategic Plan and Performance Goals
        1. Table 1
      3. GPRA Modernization Act of 2010
    2. Agency Priority Goals
      1. FY 2012-FY 2013 Agency Priority Goals
        1. FY 2012-FY 2013 NSF Agency Priority Goals
          1. Table 2
      2. FY 2012-FY 2013 Priority Goal: Access to Digital Products of NSF-Funded Research
        1. Impact Statement
        2. Goal Statement
        3. Lead Organization
        4. Relevant Strategic Objective/Performance Goal
        5. Description
        6. Contributing Programs
      3. FY 2012-FY 2013 Priority Goal: Undergraduate Programs
        1. Impact Statement
        2. Goal Statement
        3. Lead Organization
        4. Relevant Strategic Objective/Performance Goal
        5. Description
        6. Contributing Programs
      4. FY 2012-FY 2013 Priority Goal: NSF Innovation Corps
        1. Impact Statement
        2. Goal Statement
        3. Lead Organization
        4. Objective/Performance Goal
        5. Description
        6. Contributing Programs
    3. FY 2010-FY 2011 Priority Goal Report
      1. FY 2010-FY2011 Priority Goal: Science, Technology, Engineering and Mathematics (STEM) Workforce Development
        1. Goal Statement
        2. Lead Organization
        3. Relevant Strategic Objective/Performance Goal
        4. Result: Achieved. Twelve programs reached the target
          1. Table 3
        5. Lessons Learned and Future Directions
    4. FY 2012 – FY 2013 Annual Performance Plan
      1. Introduction
        1. Table 4
      2. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-1
        2. Strategic Target
        3. Goal T-1.1 INSPIRE (Integrated NSF Support Promoting Interdisciplinary Research and Education)
          1. Table 5
      3. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-2
        2. Strategic Target
        3. Goal T-2.1 PRIORITY GOAL: Undergraduate Programs
          1. Table 6
      4. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-2
        2. Strategic Target
        3. Goal T-2.2 Career-Life Balance
          1. Table 7
      5. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-3
        2. Strategic Target
        3. Goal T-3.1 International Implications
          1. Table 8
      6. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-4
        2. Strategic Target
        3. Goal T-4.1 Construction Project Monitoring
          1. Table 9
        4. Construction Project Monitoring performance trends, FY 2006-FY 2011
          1. Figure 1
      7. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-4
        2. Strategic Target
        3. Goal T-4.2 PRIORITY GOAL: Access to Digital Products of NSF-Funded Research
          1. Table 10
      8. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-1
        2. Strategic Target
        3. Goal I-1.1 PRIORITY GOAL: Innovation Corps
          1. Table 11
      9. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-1
        2. Strategic Target
        3. Goal I-1.2 Industrial and Innovation Partnerships
          1. Table 12
      10. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-2
        2. Strategic Target
        3. Goal I-2.1 Public Understanding and Communication of Science and Engineering
          1. Table 13
      11. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-2
        2. Strategic Target
        3. Goal I-2.2 K-12 Components
          1. Table 14
      12. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-3
        2. Strategic Target
        3. Goal I-3.1 Innovative Learning Systems
          1. Table 15
      13. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Goal M-1.1 Model EEO Agency
          1. Table 16
      14. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Goal M-1.2 IPA Performance Plans
          1. Table 17
      15. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Goal M-1.3 Performance Management System
          1. Table 18
      16. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-2
        2. Strategic Target
        3. Goal M-2.1 Assess Developmental Needs
          1. Table 19
      17. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-3
        2. Strategic Target
        3. Goal M-3.1 Financial System Modernization
          1. Table 20
      18. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-3
        2. Strategic Target
        3. Goal M-3.2 Time-to-Decision
          1. Table 21
        4. Time to decision performance trends, FY 2006-FY 2011
          1. Figure 2
      19. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-3
        2. Strategic Target
        3. Goal M-3.3 Virtual Merit Review Panels
          1. Table 22
      20. Other Goals, Priorities, and Activities
        1. Changes from FY 2011
        2. Table 23
        3. Lower-Priority Programs
        4. Burden Reduction/Unnecessary Plans and Reports to Congress
        5. Management Challenges
    5. FY 2011 Annual Performance Report
      1. Table 24
      2. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-1
        2. Strategic Target
        3. Lead Organization
        4. Goal T-1.1 Potentially Transformative Research (PTR)
          1. Table 25
        5. Discussion
      3. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-2
        2. Strategic Target
        3. Lead Organization
        4. Goal T-2.1 STEM Workforce Priority Goal
          1. Table 26
        5. Discussion
      4. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-3
        2. Strategic Target
        3. Lead Organization
        4. Goal T-3.1 International Implications
          1. Table 27
        5. Discussion
      5. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-4
        2. Strategic Target
        3. Lead Organization
        4. Goal T-4.1 Construction Project Monitoring
          1. Table 28
        5. Construction Project Monitoring Performance Trends, FY 2006-FY 2011
          1. Figure 3
      6. Strategic Goal 1: Transform the Frontiers
        1. Strategic Objective/Performance Goal T-4
        2. Strategic Target
        3. Lead Organization
        4. Goal T-4.2 Data Management Practices at Large Facilities
          1. Table 29
      7. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-1
        2. Strategic Target
        3. Lead Organization
        4. Goal I-1.1 IIP Grantees’ Partnerships
          1. Table 30
          2. Table 31
      8. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-2
        2. Strategic Target
        3. Lead Organization
        4. Goal I-2.1 Public Understanding and Communication
          1. Table 32
      9. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-2
        2. Strategic Target
        3. Lead Organization
        4. Goal I-2.2 K-12 Components
          1. Table 33
      10. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-3
        2. Strategic Target
        3. Lead Organization
        4. Goal I-3.1 Innovative Learning Systems
          1. Table 34
      11. Strategic Goal 2: Innovate for Society
        1. Strategic Objective/Performance Goal I-3
        2. Strategic Target
        3. Lead Organization
        4. Goal I-3.2 Partnerships for Learning Technologies
          1. Table 35
      12. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Lead Organization
        4. Goal M-1.1 Intergovernmental Personnel Agreement (IPA) Performance Plans
          1. Table 36
      13. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Lead Organization
        4. Goal M-1.2 360 Degree Evaluation Instrument
          1. Table 37
        5. Information on delayed achievement
      14. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-1
        2. Strategic Target
        3. Lead Organization
        4. Goal M-1.3 Model EEO Agency
          1. Table 38
      15. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Goal 3
        2. Strategic Objective/Performance Goal M-2
        3. Strategic Target
        4. Lead Organization
        5. Goal M-2.1 Assess Developmental Needs
          1. Table 39
        6. Information on delayed achievement
      16. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-3
        2. Strategic Target
        3. Lead Organization
        4. Goal M-3.1 Grant-By-Grant Payments
          1. Table 40
        5. Information on delayed achievement
      17. Strategic Goal 3: Perform as a Model Organization
        1. Strategic Objective/Performance Goal M-3
        2. Strategic Target
        3. Lead Organization
        4. Goal M-3.2 Time to Decision
          1. Table 41
        5. Time to Decision Performance Trends, FY 2006-FY 2011
          1. Figure 4
    6. FY 2011 External Program Evaluations List
      1. External Evaluations Conducted by Contractors
      2. DIRECTORATE FOR ENGINEERING
        1. Division of Engineering Education and Centers (EEC)
          1. Table 42
          2. Program Description
          3. Evaluation Description
          4. Findings
          5. Recommendations
          6. Agency response to recommendations
          7. Publications
          8. Actual Cost
      3. DIRECTORATE FOR ENGINEERING
        1. Division of Engineering Education and Centers (EEC)
          1. Table 43
          2. Program Description
          3. Recommendations
          4. Agency response to recommendations
          5. Publications
          6. Actual Cost
      4. DIRECTORATE FOR ENGINEERING
        1. Emerging Frontiers in Research and Innovation (EFRI) Office
          1. Table 44
          2. Program Description
          3. Evaluation Description
          4. Findings
          5. Agency response to recommendations
          6. Publications
          7. Actual Cost
      5. DIRECTORATE FOR ENGINEERING
        1. Division of Civil, Mechanical and Manufacturing Innovation (CMMI)
          1. Table 45
          2. Program Description
          3. Evaluation Description
          4. Findings
          5. Recommendations
          6. Agency response to recommendations
          7. Publications
          8. Actual Cost
      6. DIRECTORATE FOR MATHEMATICAL AND PHYSICAL SCIENCES
        1. Division of Mathematical Sciences (DMS)
          1. Table 46
          2. Program Description
          3. Evaluation Description
          4. Findings and Recommendations
          5. Agency response to recommendations
          6. Publications
          7. Actual Cost
      7. DIRECTORATE FOR MATHEMATICAL AND PHYSICAL SCIENCES
        1. Division of Chemistry (CHE)
          1. Table 47
          2. Program Description
          3. Evaluation Description
          4. Findings and Recommendations
          5. Agency response to recommendations
          6. Publications
          7. Actual Cost
      8. OFFICE OF POLAR PROGRAMS
        1. Table 48
        2. Program Description
        3. Evaluation Description
        4. Findings
        5. Recommendations
        6. Agency response to recommendations
        7. Publications
        8. Actual Cost
      9. Committees of Visitors (COVs)
    7. Strategic Plan Addendum
      1. FY 2012-FY 2013 Agency Priority Goals
        1. Access to Digital Products of NSF-Funded Research
          1. Table 49
        2. Undergraduate Programs
        3. Innovation Corps
      2. FY 2012-FY 2013 Federal Priority Goals
      3. Congressional Consultations
    8. Supporting Information
      1. Annual Performance Plan
        1. Strategies and Supporting Analysis
        2. Strategic Human Capital Management at NSF
      2. Annual Performance Report
        1. Verification and Validation of Performance Goals
        2. Data Sources, Limitations, and Intended Use
    9. FY 2010 – FY 2011 Priority Goal Report

NSF Strategic Plan

Story

An NSF for Data Scientists, Data Infrastructure, and Data Publications

The National Science Foundation Strategic Plan for 2014 – 2018 says:

NSF Agency Priority Goals:

  • Increase Data Scientists and Data Infrastructure;
  • Optimize the Award Process to Level Workload;
  • Ensure Public Access to Publications

NSF has identified three agency priority goals for FY 2015: increase public access to NSF-funded peer-reviewed publications; improve agency and awardee efficiency by leveling the award of grants across the fiscal year; and build multi-institutional partnerships to increase the number of U.S. data scientists and the usefulness and ease of use of data infrastructure. Complete descriptions can be found in NSF’s Annual Performance Plan.

NSF Portfolio Metrics: NSF maintains an extensive database describing metrics that characterize our portfolio of awards. This includes dozens of parameters, from numbers of proposals/awards to numbers of investigators/students to diversity statistics to success rates. These metrics are good indicators for many of the performance goals.

Semantic Community is Data Scientists building Data Infrastructure to Improve Public Access to Publications:

Semantic Community is participating in the NSF Grants Funding Process:

Semantic Community has repurposed the NSF Strategic and Performance Plans (PDF) and NSF Publications (Web) into a Data Publication in a Data Browser using a Spreadsheet (Excel) and Spotfire (Web Player).

The NSF Directorate for Geosciences Data Policies (See Below) says: "we expect some change, some evolution, some creativity, and some individuality as the culture of sharing grows and we become better stewards of Geoscience data."

Semantic Community is showing some change, evolution, creativity, and individuality in Data Publications for Geosciences (GEO) and EarthCube.

Details of NSF Publications and NSF Data Tables in a Data Browser are provided below.

We have proposed a presentation to the NSF FASTER CoP:

FASTER’s goal is to enhance collaboration and accelerate agencies’ adoption of advanced IT capabilities developed by Government-sponsored IT research. FASTER hosts Expedition and Emerging Technology workshops as well as monthly meetings with invited guest speakers to achieve this goal.

NITRD created FASTER for Federal agency CIOs and/or their advanced technology specialists. FASTER seeks to accelerate deployment of promising research technologies; share protocol information, standards, and best practices; and coordinate and disseminate technology assessment and testbed results. The Federal CIO Council under the leadership of the Office of Management and Budget (OMB) coordinates the use of IT systems. NITRD coordinates federally supported IT research under the leadership of OSTP (with OMB participation). FASTER, supported by the NITRD NCO, communicates with OMB and the Federal CIO Council concerning IT R&D matters that are of general interest to Federal agencies.

FASTER is responding to the Open Government Directive by using the technologies of the Social Data Web (e.g., Linked Open Data and the Semantic Web).

We proposed the presentation because we think our work addresses FASTER's goals as follows:

  • We enhance collaboration and accelerate agencies’ adoption of advanced IT capabilities with monthly meetings with invited guest speakers;
  • We seek to accelerate deployment of testbed (we call it sandbox) results; and
  • We use the technologies of the Social Data Web (e.g., Linked Open Data and the Semantic Web).

MORE TO FOLLOW

NSF Directorate for Geosciences Data Policies

Source: http://www.nsf.gov/geo/geo-data-policies/index.jsp​​

The NSF brings transparency to its long-standing policy on dissemination and sharing of research results through the required inclusion of a data management plan with each project submission:

http://www.nsf.gov/pubs/policydocs/pappguide/nsf11001/gpg_2.jsp#dmp.

Plans for the dissemination and sharing of research results will be traceable from the beginning to the end of a project (proposal, review, and annual/final report). The primary goal of this procedural change is to assure that products of research help NSF achieve its mission to promote the progress of science and engineering.

The National Science Board produced a 2005 report on “Long-Lived Digital Data Collections" which recognized that science has become increasingly data-intensive, requiring new kinds of collaborations and techniques. The new requirement that all research proposals include a Data Management Plan effectively implements a recommendation from this report. The NSB report represents one of several influential reports that note that major advances in technology are driving rapid changes in the scale, scope and complexity of science and engineering data collections. These changes can create significant obstacles for communities that strive to make the products of science openly accessible. If access to data and knowledge are to be as broad as possible, cultural and institutional changes must keep pace with technology changes.

Resources
The National Science Foundation maintains a website of resources that will help in the preparation of compliant and useful data management plans, including a set of Frequently Asked Questions. Geosciences complements these resources with information specific to geosciences research and programs, including GEO data guidance and community data resources. 

NSF: Dissemination and Sharing of Research Results
NSF: Data Management & Sharing Frequently Asked Questions (FAQs)
GEO Division Data Policies and Information

Community Feedback

It’s important to recognize that data sharing is evolving, and we do not have all the answers yet. Just like the introduction of the Broader Impacts criteria in 1997, we expect some change, some evolution, some creativity, and some individuality as the culture of sharing grows and we become better stewards of Geoscience data.

The NSB acknowledges that issues surrounding data sharing and management are complex. These issues include broad and timely access to data, sustainability of data, cost burdens associated with data management, and openness of data generated with taxpayer dollars, to name a few. An NSB Task Force on Data Policies was established in February 2010 with the charge of further defining the issues and outlining possible options to make the use of data more effective in meeting NSF’s mission. This Task Force will be carefully considering the impact of the new requirement for Data Management Plans.

GEO seeks to add to this dialog with feedback from its community regarding this new requirement. A special email address, geo-DMP-feedback@nsf.gov, is available for providing comments or submitting questions. Please also contact your program director with questions.

NSF Publications

Web Site

Source: Publications and Publications Example Below

News releases posted after June 23, 2009 can be found in the News section.

  

 


Viewing: Current Publications (Archived Publications also available), sorted by Publication Date, showing 1 - 30 of 3408.

 

Science of Science and Innovation Policy Doctoral Dissertation Research Improvement Grants (SciSIP-DDRIG)
Available Formats: HTML | PDF | TXT 
Document Type: Program Announcements & Information View Program Page
Document Number: nsf14578 
Document History: Posted: June 23, 2014.

Spreadsheet Matrix of Publications

All NSF Organizations | All Publication Types | General Information | News Releases | Newsletters / Journals | NSB Board Reports | Policies & Procedures | Program Announcements & Information | Reports | Summary of Awards
All NSF Organizations | 3408 | 282 | 660 | 323 | 72 | 372 | 706 | 434 | 64
Biology (BIO) | 959 | 202 | 142 | 62 | 7 | 311 | 168 | 163 | 15
Computer, Info. Sci., Eng. (CISE) | 927 | 202 | 82 | 60 | 7 | 312 | 225 | 127 | 21
Education (EHR) | 974 | 213 | 78 | 70 | 7 | 311 | 185 | 177 | 39
Engineering (ENG) | 940 | 206 | 77 | 66 | 8 | 314 | 243 | 128 | 12
Geosciences (GEO) | 1211 | 222 | 196 | 232 | 7 | 322 | 219 | 130 | 14
International and Integrative Activities (IIA) | 795 | 198 | 37 | 58 | 7 | 309 | 116 | 166 | 11
Math, Physical Science (MPS) | 923 | 207 | 96 | 78 | 7 | 312 | 201 | 129 | 10
National Science Board (NSB) | 159 | 10 | 7 | 0 | 70 | 5 | 1 | 41 | 0
Office of Inspector General (OIG) | 766 | 196 | 33 | 58 | 6 | 308 | 72 | 188 | 0
Social, Behavioral, Econ. (SBE) | 1509 | 205 | 77 | 58 | 10 | 311 | 180 | 149 | 11
Totals | 9163 | 1861 | 825 | 742 | 136 | 2815 | 1610 | 1398 | 133
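For use in a Data Browser such as Spotfire, the wide matrix above is often easier to work with as one row per organization and document type. Below is a minimal pandas sketch of that reshaping; the workbook, sheet, and column names are assumptions, not the actual spreadsheet in process.

```python
# A minimal sketch (assumed file, sheet, and column names) of reshaping the
# wide publications matrix into a tidy (Organization, Document Type, Count)
# table for loading into a data browser such as Spotfire.
import pandas as pd

matrix = pd.read_excel("NSFPublicationsMatrix.xlsx", sheet_name="Matrix")  # assumed workbook

tidy = matrix.melt(
    id_vars="All NSF Organizations",   # the row-label column in the matrix above
    var_name="Document Type",
    value_name="Publication Count",
).rename(columns={"All NSF Organizations": "NSF Organization"})

# One row per organization and document type, e.g.
# Geosciences (GEO) | Program Announcements & Information | 219
tidy.to_excel("NSFPublicationsTidy.xlsx", index=False)
```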

 

Spreadsheet Table Format

Title | Available Formats | Document Type | Document Number | Document History
Science of Science and Innovation Policy Doctoral Dissertation Research Improvement Grants (SciSIP-DDRIG) | HTML, PDF, TXT | Program Announcements & Information (View Program Page) | nsf14578 | Posted: June 23, 2014
... | | | |
Total: 3408 | | | |

My Note: Spreadsheet in Process
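Once the per-publication table above is filled in, the matrix of counts can be rebuilt from it with a single cross-tabulation. A hedged sketch follows; the file name and the "NSF Organization" column (which would come from the organization listing rather than the table as shown) are assumptions.

```python
# A sketch of rebuilding the matrix of counts from the per-publication table.
# The file name and column headings are assumptions.
import pandas as pd

pubs = pd.read_excel("NSFPublications.xlsx")   # one row per publication

matrix = pd.crosstab(
    pubs["NSF Organization"],     # rows: BIO, CISE, EHR, ...
    pubs["Document Type"],        # columns: News Releases, Reports, ...
    margins=True,                 # adds a totals row and column
    margins_name="Totals",
)
print(matrix)
```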

NSF Data Tables in a Data Browser

All the PDF Tables (77) were converted to MindTouch Wiki Tables with no <br> so they could be readily copied to Excel and used in a Data Publication in a Data Browser (Spotfire).

The question is: What is the most useful format for 77 tables in Spotfire? I have started with Slide 10 NSF Strategic Plan Tables, followed by Slide 11 NSF GEO Tables and Slide 12 NSF Performance Plan Tables (both in process).

All 77 tables can be viewed in the MindTouch Wiki Knowledge Base by using Google Chrome's Find function to search for "Table".
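One possible way to gather the 77 wiki tables into a single Excel workbook for Spotfire, sketched below, is to read the wiki page's HTML tables directly with pandas and write each to its own worksheet. The page URL, the output file name, and the presence of an HTML parser such as lxml are assumptions.

```python
# A hedged sketch: pull every HTML table from the wiki knowledge base page
# and write each one to its own worksheet in a single Excel workbook.
import pandas as pd

URL = "http://semanticommunity.info/Data_Science/NSF_Strategic_Plan"  # assumed page

tables = pd.read_html(URL)   # list of DataFrames, one per HTML table on the page

with pd.ExcelWriter("NSFStrategicPlanTables.xlsx") as writer:
    for i, table in enumerate(tables, start=1):
        table.to_excel(writer, sheet_name=f"Table {i}", index=False)

print(f"Wrote {len(tables)} tables")
```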

We have prepared the 3000+ NSF publications so they can be read and reasoned over by the Semantic Insights Research Assistant (SIRA), which has made considerable progress and was submitted for the DARPA Big Mechanism Project.

This may become the Semantic Medline for NSF Publications! Semantic NSFPub:)

​The NSF Strategic Plan Knowledge Base is an initial ontology.

MORE TO FOLLOW​

We also have some experts with health domain ontology applications who recently joined our Meetup and are preparing a joint presentation. See the OMB Ontology and Ontologizing Memo below. In Summary:

 An Ontology:

  • is a formal representation of meaning in an information system;
  • creates the bridge between the internal world of the computer and the external world of people’s understanding;
  • provides an inter lingua between disparate data sources and knowledge bases;
  • allows us to build useful and usable systems for complex tasks in health care.

Remember:

  • don’t try to divorce the Ontology from its application (the ‘universal ontology’)
  • building and embedding an Ontology in a useful application has pitfalls that require judgment, experience, clarity of purpose, and resources.


Ontology and Ontologizing

Essential Elements in the Link between Health Data and Value 1

1 This paper was downloaded on February 22, 2012 from the Office of Management and Budget's (OMB) Federal Enterprise Architecture (FEA) site.  URL: http://www.whitehouse.gov/sites/defa...8_102609-1.pdf​ (PDF)

In computer and information science, an Ontology is a formal representation of a set of concepts and the relationships among them within a domain such as health care. This is how the computer and information system “understand” the data.

So why do they matter?

Why you need an Ontology

If you wish to construct a useful and usable information system that performs complex, broad tasks such as compiling, maintaining, organizing, and presenting a person’s health record and driving protocols of care across many providers, for many types of user, then you must employ an Ontology. If you don’t, then the computer can do little to help. The result will be a disorganized aggregation of data that no one finds useful.

An Ontology can give us a formal representation of meaning in an information system, including the relationships among concepts; create a bridge between the computer and human understandings of the data and relationships; provide a means for understanding the relationships among data from disparate sources; and provide an understood and usable database on which to perform complex tasks.

What is an Ontology made of?

An ontology is a collection (‘database’) of concepts, such as types of diseases, medications, blood tests, symptoms, and social circumstances, and the relationships between them. For example:

‘Influenza’ -> ‘is a kind of’ -> ‘Viral infection’

‘Influenza’ and ‘Viral infection’ are concepts, and ‘is a kind of’ is a relationship that allows concepts to be formed into hierarchies. Likewise:

‘Toe’ -> ‘is part of’ -> ‘Foot’

A computer can then use a basic set of formal rules (‘software’) to reason with the concepts and their relationships (sort, select, and generally manipulate them) to do useful things.

The formal rules for manipulating concepts are a kind of algebra for the symbols. The best known are the Description Logics, for which there are now increasingly sophisticated computer tools, driven by the need to build more ‘meaning’ into the World Wide Web. (e.g. find me web pages that show some kind of Politician in some kind of Embarrassing situation).
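A toy Python sketch (not any particular ontology language or Description Logic) of how a handful of concepts, two relationships, and one simple rule--following ‘is a kind of’ and ‘is part of’ upward--let a computer draw the conclusions described above. The concept names are illustrative.

```python
# A toy ontology: concepts linked by "is a kind of" and "is part of".
IS_A = {
    "Influenza": "Viral infection",
    "Viral infection": "Infection",
}
PART_OF = {
    "Toe": "Foot",
    "Foot": "Leg",
}

def ancestors(concept, relation):
    """Follow a relationship upward and collect everything it leads to."""
    found = []
    while concept in relation:
        concept = relation[concept]
        found.append(concept)
    return found

print(ancestors("Influenza", IS_A))   # ['Viral infection', 'Infection']
print(ancestors("Toe", PART_OF))      # ['Foot', 'Leg']

# "Is Influenza a kind of Viral infection?" becomes a simple membership test.
print("Viral infection" in ancestors("Influenza", IS_A))   # True
```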

An Ontology links Formal meaning in computers with Human understanding

An ontology imparts formal meaning to the information and builds the bridge between the internal workings of the computer and the external world of human understanding. Within the computer, the concepts and relationships are represented by strings of characters (symbols) such as ‘123XY7DQM9’, which it manipulates but which are meaningless to people. A language term, such as ‘influenza’, is attached to a symbol. Although it is meaningless to the computer, it allows people to elucidate the intended meaning of the computer’s symbol/representation in the real world.

The computer manipulates the symbols according to formal rules and the results are made intelligible to people by converting the outputs into language phrases that appear on the screen. Multiple languages (English, Spanish, etc) can be attached to the same set of formal concepts to create an intrinsically multilingual system.
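A small sketch of this separation between the computer's opaque symbols and the human-readable, possibly multilingual, labels attached to them; the identifier and the terms are illustrative.

```python
# The computer reasons over an opaque symbol; people see attached labels.
CONCEPT = "123XY7DQM9"   # meaningless internal symbol

LABELS = {
    ("123XY7DQM9", "en"): "influenza",
    ("123XY7DQM9", "es"): "gripe",
}

def display(symbol, language="en"):
    """Return the human-readable term for a symbol, or the symbol itself."""
    return LABELS.get((symbol, language), symbol)

print(display(CONCEPT, "en"))   # influenza
print(display(CONCEPT, "es"))   # gripe
```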

The quality of the Ontology is measured by how well the internal workings correspond to human understanding. For example, if Mrs. Smith has influenza, then the computer should conclude she has a kind of viral disease and not a kind of broken leg.

An Ontology helps us assemble data from diverse sources into a meaningful whole

We create Ontologies to help us build useful and usable information systems. A sufficiently rich Ontology forms an inter lingua between the many types of patient data sources and knowledge bases that are found in health care. Systems such as laboratory systems and health plan systems hold and transmit data in their own format. This can be transformed into the concepts represented in an Ontology (blood tests, diseases, medications, etc).

But this ‘Ontologizing’ is much more than a mere change of format or standardization of the data. Once the many sources and knowledge bases are represented using the same Ontology, the relationships amongst them can be manipulated and a far richer and more complete picture obtained. Useful questions can then be answered such as “show me the information pertinent to this person’s diabetes” rather than just “show me the data from St Elsewhere’s lab system”.
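A hedged sketch of what this ‘Ontologizing’ could look like in code: local codes from two source systems are mapped onto shared ontology concepts, after which the question "show me the information pertinent to this person's diabetes" can be asked across both sources. All source names, codes, mappings, and records are invented for illustration.

```python
# Map source-specific codes onto shared ontology concepts, then query by need.
ONTOLOGY_MAPPING = {
    ("st_elsewhere_lab", "GLU-42"): "Blood glucose test",
    ("health_plan", "RX-881"): "Metformin prescription",
}
PERTINENT_TO = {
    "Diabetes": {"Blood glucose test", "Metformin prescription"},
}

records = [
    {"source": "st_elsewhere_lab", "code": "GLU-42", "value": "7.8 mmol/L"},
    {"source": "health_plan", "code": "RX-881", "value": "500 mg daily"},
]

def pertinent(records, condition):
    """Yield (concept, value) pairs relevant to a condition, across sources."""
    wanted = PERTINENT_TO[condition]
    for r in records:
        concept = ONTOLOGY_MAPPING.get((r["source"], r["code"]))
        if concept in wanted:
            yield concept, r["value"]

for concept, value in pertinent(records, "Diabetes"):
    print(concept, "->", value)
```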

It is analogous to translating related ancient texts into the same language. The relationships between people, events, discoveries, and beliefs come together to form a complete picture of their world. The whole is far greater than the sum of the parts.

Creating a Useful Ontology and Also How to Fail

There have been several very large-scale efforts at constructing large vocabularies that aspire to be formal Ontologies. Practical attempts to make use of them have been remarkably disappointing. Ontologies are not magic. They need to be built and maintained by people. There are several dos and don’ts that are commonly ignored.

Scope: bigger isn’t necessarily better

The wider the scope the greater the complexity of an Ontology, and the harder it becomes to build and use the thing. Attempts to create the ‘universal ontology for health’ for all purposes for all people have all collapsed under their own complexity.

A practical Ontology needs a purpose that is wide enough to be useful but narrow enough to be realizable. This is an empirical matter of judgment and experience, coupled with a clear sense of purpose.

Ontologizing data is not standardizing health care

A common mistake is to confuse building and using an Ontology with standardizing health care. For example ontology developers demand that clinicians agree on a single way of organizing drugs into categories for all purposes for all time before constructing the Ontology and hence the system. This is both unnecessary and impossible. In fact a key role of an Ontology is to cope with the often legitimate real-world diversity in health care. By using an Ontology to ‘understand’ the data it can be organized in multiple, complementary ways according to need. A sound Ontology can support standardization of practice but doesn’t itself demand it. Unnecessary arguments over the ‘one true way’ destroy projects.

An Ontology is only as valuable as its applications

An Ontology on its own does nothing for people. Embedding an Ontology in an application releases its value. This is a specialized task that takes significant resources, which developers of ‘standard ontologies’ always underestimate. All too often the struggle to incorporate the ‘general purpose standard’ becomes an obstacle to system development rather than an aid.

In Summary

An Ontology:

  • is a formal representation of meaning in an information system;
  • creates the bridge between the internal world of the computer and the external world of people’s understanding;
  • provides an inter lingua between disparate data sources and knowledge bases;
  • allows us to build useful and usable systems for complex tasks in health care.

Remember:

  • don’t try to divorce the Ontology from its application (the ‘universal ontology’)
  • building and embedding an Ontology in a useful application has pitfalls that require judgment, experience, clarity of purpose, and resources.

Slides

Slide 1 NSF Funding & Research Community

http://www.nsf.gov/index.jsp

NSFFunding&ResearchCommunity-WebSite.png

Slide 2 NSF Strategic Plan Description

http://www.nsf.gov/about/performance...tegic_plan.jsp

NSFStrategicPlan-Description.png

Slide 3 NSF Strategic Plan Available Formats

http://www.nsf.gov/publications/pub_...s_key=nsf14043

NSFStrategicPlan-AvailableFormats.png

Slide 6 NSF Strategic Plan Spreadsheet Knowledge Base

http://semanticommunity.info/@api/deki/files/29692/NSFProposalGuide.xlsx

NSFStrategicPlan-ExcelKnowledgeBase.png

Slide 7 NSF Strategic Plan Wiki Knowledge Base

http://semanticommunity.info/Data_Science/NSF_Strategic_Plan

NSFStrategicPlan-WikiKnowledgeBase.png

Slide 8 NSF Strategic Plan Spotfire Cover Page

Web Player

NSFStrategicPlan-Spotfire-CoverPage.png

Slide 9 NSF Publications

NSFStrategicPlan-Spotfire-Publications.png

Slide 10 NSF Strategic Plan Tables

NSFStrategicPlan-Spotfire-NSFStrategicPlanTables.png

Slide 11 NSF GEO Tables

 

Slide 12 NSF Performance Plan Tables

 

Spotfire Dashboard

For Internet Explorer users and those wanting a full-screen display, use: Web Player. Get Spotfire for iPad App.

Research Notes

Source: http://www.nsf.gov/about/performance...tegic_plan.jsp 

Budget and Performance
Investing in Science, Engineering, and Education for the Nation's Future: NSF Strategic Plan for 2014 - 2018

NSF's new Strategic Plan, Investing in Science, Engineering, and Education for the Nation's Future: NSF Strategic Plan for 2014 - 2018, was released on March 10, 2014. It lays out two strategic goals that embody the dual nature of NSF's mission to advance the progress of science while benefitting the Nation: Transform the Frontiers of Science and Engineering and Stimulate Innovation and Address Societal Needs through Research and Education. A third goal, Excel as a Federal Science Agency, directs NSF to hold itself accountable for achieving excellence in carrying out its mission. This goal structure enables NSF to link its investments to longer-term outcomes. To bridge the gap between these strategic goals and measurable outputs, the Strategic Plan establishes a set of strategic objectives for each strategic goal.

Source: http://www.nsf.gov/publications/pub_...s_key=nsf14043

Available Formats: PDF 
Document Type: Reports 
Document Number: nsf14043 

Document History: Posted: March 10, 2014. Replaces: nsf11047.

National Science Foundation Strategic Plan for 2014 – 2018

Investing in Science, Engineering, and Education for the Nation’s Future

Strategic Plan for 2014 – 2018

March 2014

Summary

The National Science Foundation (NSF) is an independent Federal agency that supports fundamental research at the frontiers of knowledge, across all fields of science and engineering (S&E) and S&E education. With an annual budget of about $7.2 billion (fiscal year 2014), the agency is the funding source for approximately 24 percent of all federally supported fundamental research conducted by America’s colleges and universities. Each year, NSF receives about 50,000 competitive requests for funding proposals for research, education, and training projects, and we make about 11,500 new funding awards. To ensure that proposals are reviewed in a fair, competitive, transparent, and in-depth manner, the agency developed a rigorous merit review process that is widely recognized as the “gold standard” of evaluation. NSF’s merit review uses two criteria to evaluate the proposed activity—its intellectual merit (meaning the potential to advance knowledge) and the broader impacts (encompassing the potential to benefit society and contribute to achieving specific, desired societal outcomes). 

In FY 2012, NSF funding reached all 50 states, primarily through grants to nearly 2,000 colleges, universities, and other institutions. An estimated 319,000 people, including researchers, postdoctoral fellows, trainees, teachers, and students were supported by NSF awards. In addition, NSF supports an advanced research infrastructure that includes research vessels, astronomical observatories, particle accelerators, seismic observatories, U.S. research stations in Antarctica, advanced cyberinfrastructure, and more. Over the years, NSF-funded research and education projects and world-class S&E infrastructure have led to many significant discoveries. More than 200 Nobel Prize winners received support from NSF at some point in their careers, and the highly acclaimed achievements of these laureates are but a small fraction of the advances enabled by NSF--advances that, in turn, have stimulated economic growth and improved the quality of life, health, and security for all Americans.

NSF is committed to the principles underlying open government, including transparency, public participation, and collaboration with other government agencies and private institutions. Our stakeholders include students, faculties, and administrators of academic institutions at all levels (K-12 schools, two-year and community colleges, and undergraduate and graduate colleges and universities); not-for-profit institutions such as zoos and museums; businesses and industries engaged in science, technology, engineering, and mathematics (STEM) research, development, and manufacturing; state and other government agencies; NSF staff; the National Science Board (NSB); Congress, the Administration and other science policy makers; and the American public. The agency uses advice from a broad spectrum of recognized deliberative bodies, including the NSB, the National Academies, our advisory committees, and the research and education communities we serve. Through traditional means--such as program announcements and solicitations, conferences and workshops, advisory committees, congressional testimony, websites, press releases, and reports--and through new or emerging communications means such as social media, we inform advisers and stakeholders about what NSF is doing, and we incorporate their input into our programs and planning to achieve our mission.

Mission

Our mission was established by Congress in legislation that created the agency. The NSF Act of 1950 (Public Law 81-507) sets forth the mission: “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes.”

Investing in S&E has become widely recognized as an essential strategy to ensure the Nation’s future prosperity. Discoveries made possible through NSF-supported fundamental S&E research are key to sustaining the health of the Nation’s “innovation ecosystem,” a concept that expresses the importance of an exponentially growing mix of ideas and techniques from all fields, together with highly trained knowledge workers capable of answering the two questions at the heart of innovation: What is needed, and what is possible? While certain S&E discoveries may be used directly to meet pressing societal needs, some discoveries may be combined with other fundamental advances, sometimes made decades apart, to produce new technologies that transform our lives. Examples of such technologies include the Global Positioning System (GPS), medical diagnostic and therapeutic technologies, and the cyber revolution. Still other S&E discoveries may have no apparent technological application, but still contribute in an important way to the knowledge base from which all innovation is derived. Therefore, NSF’s commitment to advance the frontiers of science and engineering ensures the sustained vigor of both fundamental research and the Nation’s innovation ecosystem as a means to maintaining global leadership throughout the 21st century.

In order to unleash the Nation’s innovation potential, a well-prepared knowledge workforce--one that is steeped in the expanding knowledge base and the advanced technology being generated by fundamental research activities--is essential. NSF meets this need by seamlessly integrating the education of future scientists, engineers, and educators into the broad portfolio of research that we support. This investment strategy generates not only groundbreaking S&E discoveries, but it also equips the future S&E workforce with the knowledge and experience to apply the most advanced concepts and technology to meet societal challenges.

NSF Vision

A Nation that creates and exploits new concepts in science and engineering and provides global leadership in research and education.

NSF Core Values

NSF’s core values are essential and enduring tenets that guide everyone in the organization in support of our mission. NSF maintains the core values of:

Scientific Excellence – engaging the vision and expertise of our staff with that of the scientific community to create a portfolio of awards that support transformation and innovation;

Organizational Excellence – investing the resources entrusted to us optimally and efficiently, and realizing the full potential of our people in managing a capable, motivated, inclusive, and positive work environment;

Learning – continually identifying opportunities for learning and professional growth inside and outside the agency, and sharing our best insights with others;

Inclusiveness – seeking and embracing contributions from all sources, including underrepresented groups, regions, and institutions;

Accountability for Public Benefit – operating with integrity and transparency, and maintaining the highest standards of performance in administration, business processes, management, and oversight, thereby providing the best value to the U. S. taxpayer.

Strategic Planning In a Dynamic, Global Context

NSF has the responsibility to be a steward of the Nation’s research and education enterprise in the midst of changing conditions that materially affect its success. External factors that affect the enterprise include the 2008-2009 global economic crisis; the increased need for a creative, expert, and adaptable STEM workforce; and growing capabilities abroad. New opportunities are emerging and technologies are arising across all disciplines. These include crowdsourcing; new materials and sensors; breakthroughs in neuroscience, data analysis, and cloud and mobile computing--all are critical for advancing national research. In formulating the strategic goals, objectives, strategies and means, and assessment sections that follow, NSF maintains a strong focus on carrying out our mission in a way that is sufficiently flexible to meet the changing requirements of the research and education enterprise as well as to address emerging and pressing societal challenges.

At stake is the competitive strength of the Nation in the coming decades. NSF is not alone in this view. Many authoritative and diverse accounts of the increasing need to meet global challenges to U.S. economic and scientific leadership have drawn attention to the importance of continuing to invest in science that will yield new knowledge and improve the human condition.

Stakeholder Engagement

The process of developing this strategic plan involved obtaining input from NSF’s many stakeholder communities. To help shape the plan’s approach and strategic directions, NSF sought input from our staff at all levels and our numerous advisory committees that represent the scientific community. The NSB was briefed on several occasions and had opportunities to provide comments. In addition, congressional staff were given an opportunity to provide comments on the draft plan.

To the extent possible, this plan draws on quantitative and qualitative inputs derived from ongoing data collections and evaluations of NSF’s core business practices (e.g., proposal dwell time and numbers of virtual panels) as well as from reports from Committees of Visitors, NSF Advisory Committees, the NSB, the National Research Council, and others. Looking ahead, there are a number of aspects of the process of developing this plan that we will use to inform the development of future such plans for NSF. For example, the agency explored innovative approaches to engaging our staff and seeking feedback on key aspects of the plan; these included interactive poster sessions and a variety of town-hall types of settings. These efforts were also informed by the increasing NSF-wide emphasis on evaluation and the systematic monitoring of performance. The new approaches were undertaken to ensure that the agency is well-positioned to implement the new requirements for the structured reviews of the strategic objectives (below) established in the plan.

Strategic Goals

NSF uses three strategic goals--Transform the frontiers of science and engineering; Stimulate innovation and address societal needs through research and education; and Excel as a Federal Science Agency--to guide the individual and collective efforts involved in achieving the agency’s mission. The first two goals embody the dual nature of NSF’s mission--(1) advancing the progress of science while (2) benefitting the Nation--and reflect the primary means by which NSF ensures the preeminence of the research and development (R&D) enterprise that is so essential to the Nation’s future. Our approach to accomplishing strategic goals one and two is based on investing in fundamental research and education projects that are recommended for NSF support by the S&E communities using our gold standard merit review selection process. NSF’s policies and procedures maintain an open system of competition that identifies and enables pursuit of the most promising ideas for major advances. The third goal, to excel as a Federal science agency, directs NSF to hold ourselves accountable for achieving excellence in how we carry out our mission. Each strategic goal is supported by a number of strategic objectives, performance goals, and performance indicators to describe more fully the agency’s operations and business plan.

Strategic Goal 1: Transform the Frontiers of Science and Engineering

The first part of NSF’s mission is “to promote the progress of science” in order to expand and explore the frontiers of human knowledge, to enhance the ability of the Nation to meet the challenges it faces, and to create new paradigms and capabilities for scientific, technological, and (consequently) economic leadership in an increasingly fast-paced, competitive world. NSF supports fundamental, interdisciplinary, high-risk, and potentially transformative research in science and engineering, and the education of the next generation of the STEM workforce to continue this transformation. NSF welcomes proposals for original research, from both individuals and groups, and for novel discovery tools in the form of advanced instrumentation, data analysis, computation, and facilities. By relying on the insight and creativity of the Nation’s research communities, the merit review process, and efforts to broaden participation, NSF strives to create opportunities for major transformational advances. Proposals, whether submitted in response to open funding opportunities or for formal targeted solicitations, are competitively merit reviewed for intellectual merit and broader impacts by independent subject matter experts. Investment in competitively selected proposals continually expands the knowledge base from which innovation springs, in keeping with mandates from the National Science Board, Congress, and the Administration. The special role of NSF is to encourage broadly creative efforts that may not fit within those fields that are the domain of specific mission agencies, e.g., energy, defense, health, or agriculture. This strategy has a long and proven record of producing new ideas, developing important new fields of research, and, periodically, transforming the way we do science, education, and business, and changing the way we live. Instrumental to the creation of new knowledge is the preparation of future research leaders and tomorrow’s STEM workforce who are equipped with up-to-date knowledge and the experience needed to address society’s current and future challenges. Investing in the development of the next generation of researchers, scholars, and knowledge workers is one of NSF’s most important approaches to transforming the frontiers of science and engineering.

Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education

This goal flows from the latter part of the NSF mission statement--“to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes.” While several topics in this broad charge are primary missions of other agencies, NSF makes a unique contribution. Through targeted solicitations and core programs, the agency is able to focus the attention of the broader S&E community on fundamental aspects of high priority national challenges. We guide the community to open up new avenues to address these priorities and we provide funding to pursue promising approaches. This strategic goal echoes the “broader impacts” merit review criterion which was developed by the NSB, and it explicitly requires NSF to engage the community in addressing particularly urgent challenges. The strategy often necessitates the formation of partnerships with industry, other agencies, and international sponsors to build capacity, leverage resources, and increase the speed of translation from discovery to innovation. The educational dimension is a key aspect of this strategic goal. NSF supports research and development on STEM education and learning to prepare a diverse, globally competent STEM workforce and a STEM-literate citizenry.

Strategic Goal 3: Excel as a Federal Science Agency

This goal directs that NSF will integrate mission, vision, and core values to efficiently and effectively execute our activities and provide the flexibility and agility required to meet the quickly evolving challenges associated with the first two strategic goals. Accomplishing goal three entails blending strong scientific leadership with robust organizational leadership, both characterized by vision and flexibility, and also supporting the staff with the information and other resources that are essential to carry out the agency’s activities. This goal incorporates a culture of continuous improvement to ensure effective, inclusive, and accountable programs and merit review processes that provide the greatest value for taxpayer dollars. NSF’s core strength is our people, and the agency is committed to recruiting, retaining, and deepening the expertise and capabilities of our entire workforce. Our idea of inclusiveness embraces a diverse and continually changing workforce. NSF’s commitment to the innovative management of agency operations leverages the creativity of NSF staff with the opportunities provided by advances in information technology. NSF aims to drive improvements in our programs, processes, and systems, while providing high-quality service and support to all of our stakeholders. NSF also strives to align operational plans, budgets, and management practices with agency goals and priorities to create a common vision that permeates the many functions of NSF and enhances the performance of both individuals and internal organizations. Through this strategic goal, NSF will address any management challenges identified by the Inspector General.

Strategic Objectives

Strategic Goal 1 (G1): Transform the Frontiers of Science and Engineering

Strategic Objective 1 (G1/O1)

Invest in fundamental research to ensure significant continuing advances across science, engineering, and education.

This objective encompasses NSF’s largest and most important function – awarding grants to support research. This investment objective has a clear record of producing major new paradigms and technology disruptions that have the power to change our world and impact individual lives. Investments have led to major discoveries recognized by the most highly prized international awards. These types of investments have potentially high payoffs, but are not without risk, as major advances cannot result from every grant. It is rarely possible to predict whether any specific award will generate outcomes with important societal implications. Rather, fundamental research will generate new knowledge that may in the future contribute, often in unpredictable ways, to addressing a national challenge. Often, a long period of incremental advances in knowledge is needed to set the stage for the creative leaps that produce game-changing innovation.

It is vital to the successful realization of this strategic objective that NSF places a high priority on cultivating strong communities of fundamental researchers and intellectual pioneers across the globe, working both as individuals and in a variety of collaborative ways. It is also important that NSF balance its portfolio with a mix of programs and funding modalities to ensure fundamental research is conducted across a wide variety of fields of science, engineering, and education.

Strategic Objective 2 (G1/O2)

Integrate education and research to support development of a diverse STEM workforce with cutting-edge capabilities.

The global competitiveness of the United States in the 21st century depends directly on the readiness of the Nation’s STEM workforce. Educational institutions around the country must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. One of NSF’s most enduring contributions to the national innovation ecosystem is the integration of education and research in the activities we support. When students participate in cutting-edge research activities under the guidance of the Nation’s most creative scientists and engineers, the students can gain the up-to-date knowledge and practical, hands-on experience needed to develop into creative contributors who can engage in innovative activities throughout all sectors of society. The successive cadres of high-tech workers, each armed with practical knowledge of the most advanced thinking and technology of the day, create the flow of highly adaptable human capital needed to power discovery and innovation. NSF also supports the development of a strong STEM workforce by investing in building the knowledge that informs improvements in STEM teaching and learning. Such improvements include effective curricular and teaching strategies for increased student learning, as well as new approaches enabled by advanced classroom technologies. Investments in social science and education research in learning, teaching, and institutions can have major impacts when derived insights are applied to the education of the STEM workforce.

The transformation of the frontiers of science and engineering requires dramatic change in the diversity of S&E communities. The demographic evolution in the United States is reflected in a strong, growing workforce whose makeup is changing rapidly. Women and members of minority groups represent an expanding portion of the country’s potential intellectual capital. NSF is committed to increasing access for currently underrepresented groups to STEM education and careers through our investments in research and education. The resulting enhancement of diversity is essential to provide the strength that comes from diverse perspectives, as well as to assure development of the Nation’s intellectual capital.

Strategic Objective 3 (G1/O3)

Provide world-class research infrastructure to enable major scientific advances.

To fulfill our core mission of “promoting the progress of science,” NSF must provide the research community with advanced and powerful tools and capabilities to keep the Nation’s research enterprise at the global forefront. These tools and capabilities include major research facilities, mid-scale instrumentation, advanced computational and data resources, and cyberinfrastructure. In addition, it is essential to prepare the next-generation workforce to develop, maintain, and employ the infrastructure to advance science. Large facilities hold the promise of major discoveries and revolutionary advances that can propel whole fields forward, thereby justifying significant investment costs. These facilities also are training grounds for the scientists and engineers of tomorrow. Smaller, so-called “mid-scale” instruments are increasingly critical for enabling fundamental research in the experimental sciences; there is an urgent need to adequately provide this category of instrumentation. Advanced computational and data resources and cyberinfrastructure take many forms and are essential to S&E research. Balancing investments in the development and operation of these tools and capabilities with the rest of NSF’s portfolio is a challenging management responsibility. Special challenges derive from life cycle planning, human capital development, and the curation, distribution, and management of the explosion of data being produced in all fields of research. As with all NSF awards, infrastructure projects must meet extremely high standards of scientific merit and broader impacts, and comparable standards of project planning and execution.

Strategic Goal 2 (G2) Stimulate Innovation and Address Societal Needs through Research and Education

Strategic Objective 1 (G2/O1)

Strengthen the links between fundamental research and societal needs through investments and partnerships.

The first part of NSF’s mission, as expanded by the first strategic goal, is to create new knowledge and expand the Nation’s intellectual capital. However, NSF's mission does not end there. We also must connect new knowledge to innovations that address societal needs above and beyond the need for advancement in science. This strategic objective is aimed at developing connections between new insights and global challenges (often involving essential interdisciplinary collaborations, prototypes, and technologies). It also entails educating a workforce capable of using and adapting discoveries to meet society’s needs.

One approach to developing these connections is through partnerships involving other government agencies and private and international entities. Such partnerships leverage NSF resources and help ensure that fundamental research outcomes are translated into benefits to society.

Strategic Objective 2 (G2/O2)

Build the capacity of the Nation to address societal challenges using a suite of formal, informal, and broadly available STEM educational mechanisms.

NSF has the opportunity and responsibility to leverage our research and education activities to engage the public and help citizens develop a better understanding of science--one that can inform opinions about issues faced in daily living, in participation in the democratic process, and in helping to advance science. Formal education through the Nation’s K-12 schools provides the foundation for citizens’ understanding of STEM and its uses in addressing the needs of society. This learning continues for those who further their education in the Nation’s colleges and universities. Informal education is another powerful means to provide learning and instill interest in STEM topics in everyone throughout their lives. Technology holds promise for new pathways to learning, including personalized learning. By investing in research and development on STEM education and learning, NSF extends the reach of our programs to the public.

Strategic Goal 3 (G3): Excel as a Federal Science Agency

Strategic Objective 1 (G3/O1)

Build an increasingly diverse, engaged, and high-performing workforce by fostering excellence in recruitment, training, leadership, and management of human capital.

NSF is only as capable as the people who make up the organization. Therefore, recruiting and retaining the best staff in all of our organization’s roles is of utmost importance. This strategic objective focuses on those aspects of recruitment, skill enhancement, leadership, and management of human capital that serve as the foundation for effective support of NSF’s people and mission. NSF recognizes the importance of building diversity in our staff, and ensuring that our staff stays current to match priority mission responsibilities. As an agency at the cutting edge of research and education, NSF also understands that the skills needed to carry out the agency’s work are constantly changing, and, as a result, on-going training is necessary for functions needed today and in the future. An excellent organization requires both vision and direction from our leadership, as well as a constant dialogue with the staff about potential change. It must be a dialogue in which all voices are heard and all contributions are valued. NSF looks to these dimensions of human capital management in order to share with all employees how they play a vital part in maximizing NSF’s performance.

Strategic Objective 2 (G3/O2)

Use effective methods and innovative solutions to achieve excellence in accomplishing the agency’s mission.

NSF can accomplish our mission only when our operational and administrative enterprise functions work seamlessly with the front-line organizations they support. A wide range of services--including human resources; performance management; information technology (IT); financial, procurement, and administrative support--provide the wherewithal for the agency’s program staff and leadership to make critical investments in science, research, engineering, and education.

The agency uses three key strategies to achieve organizational excellence: openness, inclusion, and effectiveness. Openness and inclusion are achieved when we conduct business in a transparent, collaborative, and participatory manner with all stakeholders. Continuous, clear communication with all parties is a hallmark of openness and inclusion. An organization is effective when all business processes work to optimize administrative efficiencies, provide business intelligence for data-driven decision making, and enable organizational agility.

An essential mechanism that NSF uses to accomplish our mission is the competitive merit review of research proposals. We have developed a strong business and operations model that is emulated around the world. This model uses frontline U.S. researchers who have the best sense of where opportunities for major advances lie to evaluate proposals for original research. NSF accepts proposals in a manner that represents an open portal for new ideas in all fields, including interdisciplinary proposals. Whether they are submitted in response to open funding opportunities or for formal targeted solicitations, the proposals undergo merit review, with the subject matter experts (peers) assessing the intellectual merits and broader impacts of the proposed research. This merit review may take many forms, but all are designed to provide NSF program officers with the information they need to make award recommendations from among the (usually) large number of high-quality possibilities. The agency’s IT systems are critical to the process, facilitating the flow of proposals through the merit review, award, and oversight processes. External members of the research community periodically assess the quality of the merit review process as a means of accountability for NSF management and staff. Construction and operation of large scientific instruments and infrastructure efforts present added challenges. NSF requires extensive project execution plans, including detailed work scope, milestone schedules, and risk management; progress is monitored continually by NSF staff using extensive Earned Value Management (EVM) data, supplemented by in-depth external reviews conducted at least once a year.

Agency Priority Goals (APG) and Long-term Performance Goals

NSF has identified three agency priority goals for FY 2015: increase public access to NSF-funded peer-reviewed publications; improve agency and awardee efficiency by leveling the award of grants across the fiscal year; and build multi-institutional partnerships to increase the number of U.S. data scientists and the usefulness and ease of use of data infrastructure. Below are brief descriptions of the APGs. Complete descriptions can be found in NSF’s Annual Performance Plan.

Priority Goal 1: Increase public access to NSF-funded peer-reviewed publications

Progress in science and technology--and the associated benefits for the American people--thrives in an environment of open communication. Therefore, NSF seeks to enable increased access to the results of our investments in research by reducing barriers to communication of research results, while ensuring the integrity of the research record, protection of sensitive information, and consistency with existing laws. To this end and to fulfill the goals of the Office of Science and Technology Policy (OSTP) memorandum, Increasing Access to the Results of Federally Funded Scientific Research (February 22, 2013), NSF will articulate a strategy and develop plans that will require peer-reviewed publications that were supported by NSF funding to be publicly accessible.

Priority Goal 2: Improve the nation’s capacity in data science by investing in the development of human capital and infrastructure

The rapidity of data creation is outpacing the development of new tools to process rapidly growing, complex data sets. There is a pressing need both for new tools and for people with the expertise to develop and use those tools. NSF is in the beginning stages of creating mechanisms to catalyze the development of infrastructure and people to address the challenges posed by this new flood of data.

Using the agency’s ability to convene diverse sets of stakeholders, NSF will promote multi-stakeholder partnerships through supporting workshops and follow-on activities that bring together representatives of industry, academia, not-for-profit organizations, and other entities to address current and future big-data challenges. NSF will also leverage existing programs to support the creation of more researchers and students competent in the deep analytical and technical skills required to address those challenges. NSF intends to achieve this priority goal by September 30, 2015.

Priority Goal 3: Improve agency and awardee efficiency by leveling the award of grants across the fiscal year. NSF typically completes half of its nearly 20,000 funding actions on new and continuing awards in the fourth quarter due to the fact that almost 75 percent of proposals and funding requests are recommended for award during the last half of the fiscal year. This unbalanced award workload is partly a result of clustered proposal deadlines, and the NSF program staff practice of making funding decisions late in the fiscal year. Issuing such a high volume of awards in a compressed time period at the end of the fiscal year not only strains NSF’s workforce and other resources such as IT business systems and meeting space for conducting review panels, but it also increases risk and places added pressure on awardee capabilities coinciding with these peak workload periods. Achieving this priority goal will benefit the nation’s scientific research community, and support NSF’s strategic goal to excel as a Federal science agency. By September 30, 2015, NSF will meet targets to level distribution of awards across the fiscal year and subsequently improve awardee capacity to effectively manage research funding.

A Performance Plan has been developed in concert with this Strategic Plan. Each of the performance goals is associated with one or more strategic objective(s) in the draft Strategic Plan and will be reviewed annually in the new Strategic Reviews as well as in quarterly performance reviews. In FY 2015, NSF continues its new (FY 2014) performance framework for strategic monitoring of key program, infrastructure, and management investments (performance goals 1-3). Together with NSF’s enduring performance goal to make timely award decisions, these performance goals provide the foundation of NSF’s Performance plan. They were created to provide a means by which NSF leadership can provide strategic monitoring and oversight of progress being made on the Foundation’s most important activities: our priority program investments, research infrastructure investments, and key management initiatives. Below are brief descriptions of a few exemplar Performance Goals from NSF’s Annual Performance Plan.

1. We will meet critical targets for key program investments. Major NSF-wide investments will be strategically monitored using a set of common metrics and indicators.

2. We will ensure program integrity and exercise responsible stewardship of key research facilities. NSF has tracked the performance of its construction projects as a performance goal for over a decade.

3. NSF will use evidence-based reviews to guide management investments. This goal captures NSF’s commitment to using two government-wide processes, Portfolio Stat and HR Stat, which aim to ensure that decisions regarding resource investments are made through formal processes involving cross-agency decision-makers.

Appendix I - Strategies and Means

Introduction

NSF supports our mission by promoting the advance of science, preparing the STEM scholars of tomorrow, and continuously strengthening the Nation’s innovation ecosystem including the leveraging of partnership opportunities. This work is both exciting and challenging. NSF receives tens of thousands of proposals each year from a community comprising on the order of 1 million researchers, educators, and students in all areas of S&E. The agency can only fund a fraction of the meritorious proposals, and we must make awards that keep the entire S&E community vigorous and always expanding the limits of human knowledge. The review and decision process must be carried out with integrity and transparency in order to maintain trust that the resulting decisions are fair, forward-looking, and the optimum use of limited resources.

NSF’s enabling legislation provides much wisdom as to how to accomplish the agency’s mission; over the years, a robust menu of strategies and means has been developed by NSF staff under the oversight of the NSB and Congress, and in concert with the communities NSF serves. The result has been the development of a research enterprise that is unrivaled in the world. Efforts by other advanced countries to elevate their research capabilities to approach those of the U.S. are well known. This favorable position notwithstanding, the global competition for S&E excellence is unrelenting, and NSF continuously seeks ways to improve. Within this context, the core strategies and means by which NSF addresses our mission are summarized below.

Core Strategies

• Envision the future of science and engineering through the eyes of the world’s front-line researchers.

• Maintain a balanced portfolio that provides opportunities for original research in all fields of S&E.

• Maintain a balanced portfolio with regard to the mechanisms of support.

• Selectively invest in targeted areas of research related to high priority societal needs.

• Foster the development of fundamental research and innovation.

• Integrate research and education to equip the continuous flow of STEM graduates with the latest ideas, technological know-how, and networks of contacts.

• Ensure diversity is at the forefront of all of NSF’s internal and external activities to develop the Nation’s intellectual potential.

• Maintain the public’s trust by operating with transparency, accountability, integrity, and ethical conduct.

• Maintain NSF’s reputation as the gold standard of merit review, while welcoming constructive criticism and seeking continuous improvement.

Means for Carrying Out Core Strategies

• Align budget and other major decisions with the Strategic Plan.

• Make difficult choices and set priorities in the context of strategic goals.

• Interact with NSF stakeholders proactively and efficiently.

• Partner with other science sponsors and professional organizations to achieve goals.

• Be responsive to proposals that pursue new ideas, including interdisciplinary proposals.

• Maintain an open and widely discussed strategic portfolio balance among S&E fields and among different modalities of research.

• Maintain extensive documentation, tracking, and public dissemination of performance indicators.

• Develop, where appropriate, quantitative or evidence-based evaluation of outcomes.

• Maintain up-to-date digital tools and business systems.

• Complement the expertise of NSF’s permanent staff with the knowledge and up-to-date experience of temporary staff drawn from among leading researchers and educators in the broader community.

Measuring Performance

The desire to improve the performance of Federal agencies and to realize the maximum value from public investments has led to an increasing emphasis on performance measurement of the outputs and outcomes of all agencies. This is reflected in the Government Performance and Results Act (GPRA) of 1993 and the GPRA Modernization Act of 2010. Evaluation methodologies are challenging for activities such as research that depend upon individual creativity aimed at outcomes that are unpredictable, or dependent on parallel research, or whose major impact occurs far outside the planning and management timeframes of the evaluation process. Moreover, intrinsic to the notion of supporting “high risk, high potential payoff” research is the fact that efforts to make transformational discoveries or advances meet with many dead ends and even failure.

The suite of assessment tools described below is designed to be suitable for the nature of the activity, taking into account the actual outputs and outcomes sought and the timeframes over which they are realized. For research activities aimed at results that are expected to be achieved on about the timescale of the award period, specific milestones and metrics can be identified. For activities that typically produce results on much longer timescales and for which ultimate impacts may occur beyond the planning horizon, assessments by independent experts in the field and benchmarking against analogous research activities in other laboratories and other countries are specified. Expert judgment does not lend itself to quantification, but can be used to assess the success or failure of certain categories of research. Benchmarking provides a way of determining the relative quality and health of a research portfolio measured against the world standard.

The core approaches and attempts to develop new methods for measuring the performance of the NSF portfolio are described below. The core methodologies are applied singly and in combination, as appropriate, to provide the most rigorous and independent measurement possible for each program. The development and testing of the new approaches will take place during the timeframe of this Strategic Plan.

Core Approaches for Measuring Performance

To gauge progress toward NSF’s strategic and performance goals, this plan establishes a framework for evaluation and assessment that builds upon longstanding NSF processes and also embraces new approaches.

Merit Review: All NSF funding decisions are based on merit review by subject matter experts. This is a leading indicator of performance, rather than an assessment of performance outputs/outcomes, but it is the primary assessment that is used to ensure the impact of NSF’s programs.

Committee of Visitors (COV) assessments: NSF programs undergo an external assessment by a COV every three years. COVs assess the quality and integrity of the merit review process, program operations, and program-level technical and managerial matters pertaining to proposal decisions.

Portfolio Metrics: NSF maintains an extensive database of metrics that characterize our portfolio of awards. This includes dozens of parameters, from numbers of proposals/awards to numbers of investigators/students to diversity statistics to success rates. These metrics are good indicators for many of the performance goals.

Performance Metrics: Data on a number of performance indicators are also kept to monitor performance relevant to many of the performance goals, e.g., cost/schedule performance for facility construction, proposal processing time, hiring statistics, and performance of the awards and contracting process.

Independent Audits: Independent audits of NSF’s financial performance provide accountability to stakeholders; NSF has a clean audit record over the last 15 years. The Government Accountability Office (GAO) and the agency Inspector General (IG) also conduct audits of NSF activities.

Benchmarking: For programs whose ultimate outcomes occur over time frames longer than grant periods, NSF can benchmark against similar organizations worldwide to assure the strength of our programs as a measure of future performance. This assessment is conducted by external subject-matter experts with knowledge of the international context of a given program. It is based upon information provided by the program in the form of a self-assessment as well as the general knowledge of the assessment team.

Advisory Committees: The Federal Advisory Committee Act (FACA) provides a means to receive consensus advice from carefully vetted advisory committees of external experts. NSF has a FACA advisory committee for each directorate, and additional FACA committees advise two or more agencies on matters pertaining to select fields or crosscutting activity. These committees and their ad hoc subcommittees provide both advice and feedback from the community on the performance of NSF programs, all of which is posted on the NSF website.

Other Reviews and Evaluations: NSF pays careful attention to the advice offered in reports by the National Academies, national and international science organizations, professional societies, workshops, interagency working groups, and the NSB. NSF also employs a range of approaches to gain feedback on the efficiency and effectiveness of its systems and processes. Formal approaches include the testing of NSF IT systems against Federal standards such as those established by the Federal Information Security Management Act (FISMA). Other methods used include periodic surveys of applicants for NSF funding to gauge overall satisfaction, as well as surveys of NSF staff for feedback on NSF’s internal services, systems, and working environment.

New Approaches to Evaluation and Assessment

In addition to these longstanding inputs and processes, this Strategic Plan establishes a commitment to innovation and experimentation to strengthen the assessment process:

New Strategic Review (SR): A new assessment process has been established in response to the GPRA Modernization Act of 2010. Its implementation is described in OMB Circular A-11, Part 6. The new process, the Strategic Review (SR), is based on an agency-level self-assessment of each strategic objective in this plan. A report of the annual SR will be submitted to OMB as part of the annual budget process.

NSF Evaluation Initiative: The Administration has emphasized the need to use evidence-based assessment methodology in budget, management, and policy decisions to make government work effectively. In response, NSF has launched the NSF Evaluation Initiative to expand the agency’s capability in the area of evidence-based evaluation. The effort will be staffed to carry out an agency-wide coordination of data gathering and development of evaluation capabilities, and to conduct pilots for selected programs.

Appendix II -Contributing Programs for Strategic Objectives

The GPRA Modernization Act of 2010 requires each agency to develop an inventory of all federal programs. In response to this requirement, NSF categorized its federal programs by initial topic area of investment. This approach mirrors its budget structure, and the programs presented here are consistent with the program activity (PA) lines presented in the President’s Budget Appendix. This aligns to the way the agency executes its budget and is complementary with the expectations of external stakeholders. The ordering of this list follows the budget structure, with programs funded through the two program accounts (Research and Related Activities and Education and Human Resources) listed first, followed by Major Research Equipment and Facilities Construction, Agency Operations and Award Management, National Science Board, and Office of Inspector General.

The following tables, which constitute NSF’s federal program inventory, have been updated to map to the agency’s new 2014–2018 Strategic Plan.

List of NSF Strategic Goals and Objectives 2014-2018

Strategic Goal G1: Transform the Frontiers of Science and Engineering
Strategic Objective 1 (G1/O1): Invest in fundamental research to ensure significant continuing advances across science, engineering, and education.
Strategic Objective 2 (G1/O2): Integrate education and research to support development of a diverse STEM workforce with cutting-edge capabilities.
Strategic Objective 3 (G1/O3): Provide world-class research infrastructure to enable major scientific advances.
Strategic Goal G2: Stimulate Innovation and Address Societal Needs through Research and Education
Strategic Objective 1 (G2/O1): Strengthen the links between fundamental research and societal needs through investments and partnerships.
Strategic Objective 2 (G2/O2): Build the capacity of the Nation to address societal challenges using a suite of formal, informal, and broadly available STEM educational mechanisms.
Strategic Goal G3: Excel as a Federal Science Agency
Strategic Objective 1 (G3/O1): Build an increasingly diverse, engaged, and high-performing workforce by fostering excellence in recruitment, training, leadership, and management of human capital.
Strategic Objective 2 (G3/O2): Use effective methods and innovative solutions to achieve excellence in accomplishing the agency’s mission. 

 

NSF Priority Goals

Goal Short Title | Type of Goal | Goal Statement
Ensure Public Access to Publications | Agency | Increase public access to NSF-funded peer-reviewed publications. By September 30, 2015, NSF-funded investigators will be able to deposit versions of their peer-reviewed articles in a repository that will make them available to the public.
Increase the Nation’s Data Science Capacity | Agency | Improve the nation’s capacity in data science by investing in the development of human capital and infrastructure. By September 30, 2015, implement mechanisms to support the training and workforce development of future data scientists; increase the number of multi-stakeholder partnerships to address the nation’s big-data challenges; and increase investments in current and future data infrastructure, extending data-intensive science into more research communities.
Optimize the Award Process to Level Workload | Agency | Improve agency and awardee efficiency by leveling the award of grants across the fiscal year. By September 30, 2015, NSF will meet targets to level distribution of awards across the fiscal year and subsequently improve awardee capacity to effectively manage research funding.

List of Programs

Biological Sciences (BIO)

Program Title: Biological Sciences (BIO)
Program Description: This activity promotes scientific progress in biology through support of research on all levels, including molecules, cells, organisms, and ecosystems. This activity also supports a comprehensive research initiative on plant genomes, including research on economically significant crops. The Divisions within the Directorate for Biological Sciences are: • Biological Infrastructure (BIO/DBI) • Environmental Biology (BIO/DEB) • Emerging Frontiers (BIO/EF) • Integrative Organismal Systems (BIO/IOS) • Molecular and Cellular Biosciences (BIO/MCB) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/17_fy2014.pdf. (PDF)
Supported Strategic Goals: Transform the Frontiers of Science and Engineering; Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives: G1/O1; G1/O2; G1/O3; G2/O1; G2/O2
Supported Agency Priority Goals: Optimize the Award Process to Level Workload; Increase Data Scientists and Data Infrastructure; Ensure Public Access to Publications

 

Computer and Information Science and Engineering (CISE)

Program Title: Computer and Information Science and Engineering (CISE)
Program Description: This activity supports investigator-initiated research in all areas of computer and information science and engineering, helps develop and maintain cutting-edge national computational and information infrastructure for research and education generally, and contributes to the education and training of the next generation of computer and computational engineers. The divisions within the Computer and Information Science and Engineering Directorate are: • Advanced Cyberinfrastructure (CISE/ACI) • Computing and Communication Foundations (CISE/CCF) • Computer and Network Systems (CISE/CNS) • Information & Intelligent Systems (CISE/IIS) • Information Technology Research (CISE/ITR) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/18_fy2014.pdf. (PDF)
Supported Strategic Goals: Strategic Goal 1: Transform the Frontiers of Science and Engineering; Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives: G1/O1, G1/O2, G1/O3; G2/O1, G2/O2
Supported Agency Priority Goals: Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

Engineering (ENG)

Program Title: Engineering (ENG)
Program Description: Research supported by this activity aims to increase U.S. engineering capability and strength, and to focus that capability and strength on areas that are relevant to national problems and long-term needs. This activity also includes small business innovation research. The Divisions within the Engineering Directorate are: • Chemical, Bioengineering, Environmental, and Transport Systems (ENG/CBET) • Civil, Mechanical and Manufacturing Innovation (ENG/CMMI) • Electrical, Communications and Cyber Systems (ENG/ECCS) • Engineering Education and Centers (ENG/EEC) • Emerging Frontiers in Research and Innovation (ENG/EFRI) • Industrial Innovation and Partnerships (ENG/IIP) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/19_fy2014.pdf. (PDF)
Supported Strategic Goals: Strategic Goal 1: Transform the Frontiers of Science and Engineering; Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives: G1/O1, G1/O2, G1/O3; G2/O1, G2/O2
Supported Agency Priority Goals: Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

Geosciences (GEO)

Program Title: Geosciences (GEO)
Program Description: This activity supports research and associated infrastructure to advance knowledge of the properties and dynamics of the planet on which we live. Research includes understanding the causes and implications of climate change, as well as disruptive processes such as earthquakes and storms. Additionally, this activity supports Arctic and Antarctic research and operational science support and other related activities for United States polar research programs, including the funding to reimburse Federal agencies for logistical and other related activities supported by the United States Antarctic Program. The Divisions within the Geosciences Directorate are: • Atmospheric and Geospace Sciences (GEO/AGS) • Earth Sciences (GEO/EAR) • Integrative and Collaborative Education and Research (GEO/ICER) • Ocean Sciences (GEO/OCE) • Polar Programs (GEO/PLR) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/20_fy2014.pdf. (PDF)
Supported Strategic Goals: Strategic Goal 1: Transform the Frontiers of Science and Engineering; Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives: G1/O1, G1/O2, G1/O3; G2/O1, G2/O2
Supported Agency Priority Goals: Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications
DIRECTORATE FOR GEOSCIENCES (GEO)

Source: http://nsf.gov/about/budget/fy2014/pdf/20_fy2014.pdf. (PDF)

$1,393,860,000
+$72,720,000 / 5.5%

Table GEO Funding (Dollars in Millions)
Program | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Atmospheric & Geospace Sciences (AGS) | $258.65 | $258.66 | $266.61 | $7.95 | 3.1%
Earth Sciences (EAR) | 183.43 | 183.50 | 191.20 | 7.70 | 4.2%
Integrative & Collaborative Education & Research (ICER) | 91.30 | 91.21 | 93.71 | 2.50 | 2.7%
Ocean Sciences (OCE) | 351.79 | 351.90 | 377.44 | 25.54 | 7.3%
Polar Programs (PLR) | 436.20 | 435.87 | 464.90 | 29.03 | 6.7%
U.S. Antarctic Logistical Support | [67.52] | [67.52] | [67.52] | - | -
Total, GEO | $1,321.37 | $1,321.14 | $1,393.86 | $72.72 | 5.5%


Totals may not add due to rounding.  
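The last two columns of the table above are derived values. As an illustration of how this flattened budget table can be treated as data, the following minimal Python/pandas sketch (column names are invented for the example, not NSF's) recomputes the Change Over FY 2012 Enacted amount and percent from the FY 2012 Enacted/Annualized FY 2013 CR and FY 2014 Request columns.

```python
import pandas as pd

# Division rows transcribed from "Table GEO Funding" above (dollars in millions).
rows = [
    ("Atmospheric & Geospace Sciences (AGS)", 258.65, 258.66, 266.61),
    ("Earth Sciences (EAR)", 183.43, 183.50, 191.20),
    ("Integrative & Collaborative Education & Research (ICER)", 91.30, 91.21, 93.71),
    ("Ocean Sciences (OCE)", 351.79, 351.90, 377.44),
    ("Polar Programs (PLR)", 436.20, 435.87, 464.90),
]
df = pd.DataFrame(rows, columns=["Program", "FY2012_Actual", "FY2012_Enacted_FY2013_CR", "FY2014_Request"])

# The two "Change Over FY 2012 Enacted" columns are simple differences and ratios.
df["Change_Amount"] = (df["FY2014_Request"] - df["FY2012_Enacted_FY2013_CR"]).round(2)
df["Change_Percent"] = (100 * df["Change_Amount"] / df["FY2012_Enacted_FY2013_CR"]).round(1)

print(df[["Program", "Change_Amount", "Change_Percent"]])
# Reproduces the printed changes (e.g., AGS: +7.95, 3.1%); the "Total, GEO" row
# (+$72.72M, 5.5%) follows the same rule, with totals subject to rounding.
```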

About GEO

GEO supports basic research that advances the frontiers of knowledge and drives technological innovation while improving our understanding of the many processes that affect the global environment. These processes include the role of the atmosphere and oceans in climate, the planetary water cycle, and ocean acidification. Support is provided for interdisciplinary studies that contribute directly to national research priorities such as: understanding, adapting to, and mitigating the impacts of global change; developing and deploying integrated ocean observing capabilities to support ecosystem-based management; and understanding future availability of fresh water. Lives are saved and property is preserved through better prediction and understanding of natural environmental hazards, such as earthquakes, tornados, hurricanes, tsunamis, drought, and solar storms. Basic research supported by GEO enables preparation for and subsequent mitigation of, or adaptation to, the effects of these and other disruptive natural events.

GEO supports research spanning the Atmospheric, Earth, Ocean, and Polar sciences. As the primary U.S. supporter of fundamental research in the polar regions, GEO provides interagency leadership for U.S. polar activities. In the Arctic, NSF helps coordinate research planning as directed by the Arctic Research Policy Act of 1984. The NSF Director chairs the Interagency Arctic Research Policy Committee created for this purpose, which is now a component of the President’s National Science and Technology Council. In the Antarctic, per Presidential Memorandum 6646, GEO manages all U.S. activities as a single, integrated program, making Antarctic research possible for scientists supported by NSF and by other U.S. agencies. The latter include the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), the U.S. Geological Survey (USGS), the Smithsonian Institution, and the Department of Energy. The U.S. Antarctic Program research activity supported by NSF also supports leadership by the U.S. Department of State in the governance of the continent and Southern Ocean under the aegis of the Antarctic Treaty.

GEO provides about 61 percent of the federal funding for basic research at academic institutions in the geosciences.

Figure GEO Subactivity Funding (Dollars in Millions)

(Image: GEOSubactivityFundingFigure1.png)

FY 2014 Summary by Division
  • AGS’s FY 2014 Budget Request is focused on enhancing support of the NSF-wide Science, Engineering, and Education for Sustainability (SEES) investment, maintaining support for disciplinary and interdisciplinary research activities, and support of the observational infrastructure required to conduct modern research, including overseeing operation of the NCAR-Wyoming Supercomputing Center.
  • EAR’s FY 2014 Budget Request is focused on enhancing support of the NSF-wide SEES investment, maintaining support for disciplinary and interdisciplinary research activities, and support of the observational infrastructure required to conduct modern research. A realignment of infrastructure support within EAR begins in FY 2013, and while overall support levels remain relatively steady, previously-existing facilities are being integrated into new activities.
  • ICER’s FY 2014 Budget Request will support emerging priority areas, such as Cyberinfrastructure Framework for 21st Century Science, Engineering, and Education (CIF21), while enhancing support of the NSF-wide SEES investment. Significant shifts in GEO’s undergraduate education portfolio will occur, with investments moving toward the consolidated NSF-wide activity, Catalyzing Advances in Undergraduate STEM Education (CAUSE), which is led by the Directorate for Education and Human Resources.
  • OCE’s FY 2014 Budget Request will enhance support of the NSF-wide SEES investments where emphasis will be on understanding coastal systems and mitigating the impacts of disasters. OCE is strongly supporting the President’s Executive Order establishing a National Ocean Policy (NOP) 1 through enablement of research, education, and infrastructure. OCE will continue to invest in research infrastructure, as well as develop and evaluate plans for potential new Regional Class Research Vessels. Support for the Ocean Observatories Initiative (OOI) increases as operations ramp up.
  • PLR's FY 2014 Budget Request is focused on maintaining strong disciplinary programs, targeted basic research in cross-foundation and interagency priorities, and supporting and improving the efficiency of critical facilities that enable research in both polar regions, most notably support for implementing recommendations of the U.S. Antarctic Program Blue Ribbon Panel (BRP). 2
Major Investments
Table GEO Major Investments (Dollars in Millions)
Area of Investment | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
CAREER | $15.60 | $13.08 | $14.18 | $1.10 | 8.4%
CIF21 | 4.49 | 8.00 | 16.50 | 8.50 | 106.3%
I-Corps | 0.19 | 0.25 | 1.75 | 1.50 | 600.0%
INSPIRE | 0.86 | 2.00 | 6.00 | 4.00 | 200.0%
SEES | 58.75 | 58.75 | 86.27 | 27.52 | 46.8%

Major investments may have funding overlap and thus should not be summed.

  • Faculty Early Career Development Program (CAREER): This Foundation-wide activity offers the National Science Foundation's most prestigious awards in support of junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education, and the integration of education and research within the context of the mission of their organizations. Approximately 30 awards will be made in FY 2014.
  • CIF21: GEO support for the NSF-wide CIF21 investment ($16.50 million) will predominantly support development of EarthCube. A partnership with the Directorate for Computer and Information Science and Engineering (CISE), EarthCube seeks transformative concepts and approaches to create an integrated data management infrastructure across the geosciences. In addition, proposals submitted to CIF21 competitions will be considered for support along with other CIF21 activities across GEO.
  • NSF Innovation Corps (I-Corps): GEO support of I-Corps gives project teams access to resources to help determine the readiness of technology developed by previously-funded or currently-funded NSF projects to transition from idea to product.
  • Integrated NSF Support Promoting Interdisciplinary Research and Education (INSPIRE): Intended to encourage cross-disciplinary science, INSPIRE helps to break down disciplinary barriers and encourages program managers to use new tools, collaboration modes, and techniques in the merit-review process to widen the pool of prospective discoveries that may be hidden from or circumvented by traditional means. Leveraged by centralized support, scientists will utilize INSPIRE to pursue novel interdisciplinary research at the forefront of the geosciences.
  • SEES: GEO supports a portfolio of activities that highlight NSF's unique role in helping society address the challenges of achieving sustainability. In FY 2014, existing areas, especially coastal and hazards-related research, will be strengthened and a thrust on sustainable materials will be supported by GEO.
GEO Funding for Centers Programs and Facilities
Table GEO Funding for Centers Programs (Dollars in Millions)
Center | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Centers Programs Total | $17.66 | $17.70 | $16.11 | -$1.59 | -9.0%
Nanoscale Science & Engineering Centers (ICER) | 0.25 | 0.25 | - | -0.25 | -100.0%
Science & Technology Centers (AGS, OCE, PLR) | 17.41 | 17.45 | 16.11 | -1.34 | -7.7%

Totals may not add due to rounding

For detailed information on individual centers, please see the NSF-Wide Investments chapter.

  • Nanoscale Science and Engineering Centers (NSEC): NSEC support will decrease by $250,000 as the award to the single center partially supported by GEO concludes.
  • Science and Technology Centers (STCs): GEO supports a total of four Science and Technology Centers through funding from three divisions. FY 2014 completes the planned sunsetting of the Center for the Remote Sensing of Ice Sheets (CReSIS) as the center enters its final year of support.
Table GEO Funding for Facilities (Dollars in Millions)
Facilities | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Facilities Total | $614.79 | $591.53 | $647.66 | $56.13 | 9.5%
Academic Research Fleet (OCE) | 92.96 | 78.75 | 86.00 | 7.25 | 9.2%
Arctic Research Support & Logistics (PLR) | 42.08 | 43.54 | 44.00 | 0.46 | 1.1%
Arecibo Observatory (AGS) | 3.63 | 3.20 | 3.50 | 0.30 | 9.4%
Geodesy for the Advancement of Geoscience & EarthScope (EAR) | 11.92 | 13.18 | 12.70 | -0.48 | -3.6%
IceCube Neutrino Observatory (PLR) | 3.45 | 3.45 | 3.45 | - | -
International Ocean Discovery Program (OCE) | 51.68 | 44.40 | 50.00 | 5.60 | 12.6%
National Nanotechnology Infrastructure Network (ICER) | 0.60 | 0.60 | 0.30 | -0.30 | -50.0%
National Center for Atmospheric Research (AGS) | 103.00 | 98.60 | 99.00 | 0.40 | 0.4%
Ocean Observatories Initiative (OCE) | 26.80 | 26.80 | 52.80 | 26.00 | 97.0%
Seismological Facilities for the Advancement of Geosciences & EarthScope (EAR) | 26.12 | 26.76 | 25.70 | -1.06 | -4.0%
U.S. Antarctic Facilities & Logistics (PLR) | 185.02 | 184.73 | 202.69 | 17.96 | 9.7%
U.S. Antarctic Logistical Support (PLR) | 67.52 | 67.52 | 67.52 | - | -

Totals may not add due to rounding.

For detailed information on individual facilities, please see the Facilities chapter.

GEO has increased operations and maintenance budgets for facilities in order to provide the infrastructure needed by the broad geoscience research community and address BRP recommendations related to Antarctic infrastructure. Notable items include:

  • The increase in support for the Academic Research Fleet reflects the start of operation of the R/V SIKULIAQ and increased fuel costs.
  • The funding increase for the International Ocean Discovery Program enables a longer operational schedule for the vessel.
  • Operational support for the Ocean Observatories Initiative (OOI) increases as the observatory procures spare parts and operations ramp up.
  • U.S. Antarctic Facilities and Logistics, responding to BRP recommendations, will provide a significant increase to improve operational efficiency, reduce future operations costs, and increase safety of researchers and contractors working in Antarctica.
Summary and Funding Profile

In FY 2014, the number of research grant proposals is expected to be about 5,000 and GEO expects to award about 1,400 research grants. Average annual award size and duration are not expected to materially fluctuate in FY 2012 through FY 2014.

Operations and maintenance (O&M) funding for GEO-supported user facilities totals $646.36 million, and comprises 46 percent of GEO’s FY 2014 Request. GEO has increased operations budgets for facilities to maintain current operational capacity, keep pace with needed increases for OOI O&M, and respond to BRP recommendations to increase efficiency of activities in Antarctica.

Table GEO Funding Profile
Statistics | FY 2012 Actual Estimate | FY 2012 Enacted/Annualized FY 2013 CR Estimate 1 | FY 2014 Estimate
Statistics for Competitive Awards | | |
Number of Proposals | 5,246 | 5,774 | 5,500
Number of New Awards | 1,640 | 1,788 | 1,700
Funding Rate | 31% | 31% | 31%
Statistics for Research Grants | | |
Number of Research Grant Proposals | 4,716 | 5,140 | 5,000
Number of Research Grants | 1,361 | 1,560 | 1,400
Funding Rate | 29% | 30% | 28%
Median Annualized Award Size | $129,866 | $132,750 | $136,000
Average Annualized Award Size | $170,566 | $167,867 | $171,000
Average Award Duration, in years | 2.6 | 2.8 | 2.8

1 Award estimates shown for FY 2013, such as numbers of awards and size/duration, are based upon the FY 2012 Enacted level.
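The Funding Rate rows in the profile above are simply the number of awards divided by the number of proposals, rounded to the nearest whole percent. A small illustrative check, with values copied from the table and variable names of my own choosing:

```python
# Funding rate = new awards / proposals, rounded to whole percent.
competitive_awards = {
    "FY 2012 Actual Estimate": (5246, 1640),  # (proposals, new awards)
    "FY 2013 CR Estimate":     (5774, 1788),
    "FY 2014 Estimate":        (5500, 1700),
}
for year, (proposals, awards) in competitive_awards.items():
    print(f"{year}: {100 * awards / proposals:.0f}% funding rate")  # -> 31%, 31%, 31%, matching the table
```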

Program Monitoring and Evaluation

External Program Evaluations and Studies:

  • In FY 2012, NSF and the Office of Science and Technology Policy received the report of the U.S. Antarctic Blue Ribbon Panel, More and Better Science in Antarctica Through Increased Logistical Effectiveness. 3 The FY 2014 Request allocates funding to begin implementing the recommendations of this group.
  • Two reports from the National Research Council were received in 2012: New Research Opportunities in the Earth Sciences, 4 which identifies emerging research directions for consideration by NSF; and Challenges and Opportunities in Hydrologic Sciences, 5 which highlights research opportunities to help us better understand the role of water in the Earth system.

Workshops and Reports:

  • Many workshops are convened each year to allow the research communities supported by GEO to articulate priorities. Of particular note in 2012 was an opportunity for considerable community planning around GEO’s EarthCube activity. This ambitious program seeks to develop the cyberinfrastructure to knit together geoscience data from a variety of disciplines; it is GEO’s contribution to NSF’s CIF21 investment and is central to GEO’s strategy to capitalize on Big Data.

Committees of Visitors (COV):

  • In 2012, COVs reviewed the NCAR and Facilities Section within AGS, 6 and Ocean Research and Education programs in OCE. 7 The COVs’ reports were presented to the GEO Advisory Committee, which convened in April and October of 2012.
  • In 2014, COVs will be held to review the Atmosphere Section within AGS and Instrumentation and Facilities programs in EAR.

The Performance chapter provides details regarding the periodic reviews of programs and portfolios of programs by external Committees of Visitors and directorate Advisory Committees. Please see this chapter for additional information.

Table Number of People Involved in GEO Activities
Category | FY 2012 Actual Estimate | FY 2013 Estimate | FY 2014 Estimate
Senior Researchers | 6,924 | 7,700 | 7,300
Other Professionals | 3,288 | 3,700 | 3,500
Postdoctorates | 675 | 800 | 700
Graduate Students | 2,947 | 3,100 | 3,100
Undergraduate Students | 2,531 | 2,300 | 2,700
Total Number of People | 16,364 | 17,600 | 17,300

          
 

DIVISION OF ATMOSPHERIC AND GEOSPACE SCIENCES (AGS)

$266,610,000

+$7,950,000 / 3.1%

Table AGS Funding (Dollars in Millions)
Category | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Total, AGS | $258.65 | $258.66 | $266.61 | $7.95 | 3.1%
Research | 121.11 | 124.28 | 139.65 | 15.37 | 12.4%
CAREER | 6.45 | 5.48 | 5.90 | 0.42 | 7.7%
Centers Funding (total) | 4.00 | 4.00 | 4.00 | - | -
STC: Multi-Scale Modeling of Atmospheric Processes | 4.00 | 4.00 | 4.00 | - | -
Education | 3.78 | 1.96 | 1.70 | -0.26 | -13.3%
Infrastructure | 133.76 | 132.42 | 125.26 | -7.16 | -5.4%
Arecibo Observatory | 3.63 | 3.20 | 3.50 | 0.30 | 9.4%
National Center for Atmospheric Research (NCAR) | 103.00 | 98.60 | 99.00 | 0.40 | 0.4%
Research Resources | 27.13 | 30.62 | 22.76 | -7.86 | -25.7%

Totals may not add due to rounding.

The mission of AGS is to extend the intellectual frontiers in atmospheric and geospace sciences by making investments in fundamental research, technology development, and education that enable discoveries, nurture a vibrant, diverse scientific workforce, and help attain a prosperous and sustainable future. AGS supports activities to further our understanding of the physics, chemistry, and dynamics of Earth’s atmosphere, from the Earth’s surface to the Sun, on timescales ranging from minutes to millennia. AGS provides support for: 1) basic science projects and 2) the acquisition, maintenance, and operation of observational and cyberinfrastructure facilities and services that enable modern-day atmospheric and geospace science research activities. Although the majority of AGS support is through individual investigator merit-reviewed multi-year grants, the division also supports small-scale, limited-duration exploratory research projects; collaborative or multi-investigator group projects focusing on a particular problem, subject, or activity; large center or center-like projects; and funding for the research conducted at facilities provided by the NSF-supported National Center for Atmospheric Research (NCAR), which extends and enhances research at universities. More information on NCAR is available in the Facilities chapter.

The division will increase support in key areas of fundamental atmospheric and geospace science, including dynamics and predictability of high-impact atmospheric and space weather hazards, and support for research concerning the complex and dynamic interactions among natural and human-driven processes in coastal areas through its contributions to NSF’s Science, Engineering, and Education for Sustainability (SEES) activities of Hazards SEES and Coastal SEES.

Recognizing the close interplay between the division’s support for science and the provision of facilities to support that science, AGS seeks a balance between its support for science and facilities. Approximately 50 percent of the annual budget of AGS is used to support observational and computational facilities, as well as the Arecibo Observatory and the Federally Funded Research and Development Center, NCAR. The Arecibo Observatory is also supported by the Division of Astronomical Sciences in the Directorate for Mathematical and Physical Sciences. The remaining 50 percent of the AGS budget is for individual, small group, and center-like research grants. In general, of the 50 percent of the AGS budget available for research grants, about 50 percent (or 25 percent of the total AGS portfolio) is available for new research grants; the remaining 50 percent funds continuing grants made in previous years.
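To make the portfolio arithmetic above concrete, the following rough sketch applies the approximate 50/50 facilities-versus-research split, and the further 50/50 split between new and continuing grants, to the AGS FY 2014 Request of $266.61 million; the shares are stated as approximations, so the resulting dollar figures are illustrative only, not NSF estimates.

```python
# Illustrative application of the approximate AGS portfolio shares (not NSF figures).
ags_total = 266.61                    # AGS FY 2014 Request, dollars in millions
facilities = 0.50 * ags_total         # observational/computational facilities, Arecibo, NCAR
research = 0.50 * ags_total           # individual, small group, and center-like research grants
new_grants = 0.50 * research          # about 25 percent of the total portfolio for new grants
print(f"facilities ~= ${facilities:.0f}M, research ~= ${research:.0f}M, new grants ~= ${new_grants:.0f}M")
# facilities ~= $133M, research ~= $133M, new grants ~= $67M (illustrative only)
```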

FY 2014 Summary

All funding decreases/increases represent change over the FY 2012 Enacted level.

Research

  • Support for early-career researchers is an AGS priority. The division increases its support for CAREER grants by $420,000 to a total of $5.90 million.
  • The Space Weather Research program supports the development of integrative space science models and the extension of network observing capabilities, with the goal of meeting societal needs for improved monitoring and prediction of space weather phenomena and effects. Support for this program is increased by $1.0 million, to a total of $7.23 million.
  • In FY 2014 AGS will contribute $19.40 million in support of the cross-directorate research opportunities within the Science, Engineering, and Education for Sustainability (SEES) portfolio. AGS will support Hazards SEES at $5.40 million and Coastal SEES at $3.0 million. For AGS, the overarching goal of Hazards SEES is to catalyze basic research in hazard-related science to support a broad spectrum of research into the improved understanding and prediction of atmospheric and space weather hazards. Among the goals of Coastal SEES are enabling place-based, system-level understanding of vulnerable coastal systems; yielding outcomes with quantitative predictive value; and identifying pathways to enhance coastal resilience. AGS support of the SEES activity Decadal and Regional Climate Prediction using Earth System Models (EaSM) will continue at $10.0 million, while Water Sustainability and Climate will be supported at a level of $1.0 million.
  • The Center for Multi-scale Modeling of Atmospheric Processes, an NSF Science and Technology Center (STC) initiated in FY 2006, will be maintained at $4.0 million.

Education

  • AGS funding for the Research Experiences for Undergraduates (REU) Sites and Supplements program decreases $250,000 below the FY 2012 Enacted level to $1.0 million, reflecting increased support for CAREER. $180,000 of FY 2014 funding will support enhanced research experiences for students in their first two years of college, as recommended by the President’s Council of Advisors on Science and Technology (PCAST) in their report, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics.
  • AGS supports other education activities including AGS Postdoctoral Fellows. The total AGS education portfolio is $1.70 million in FY 2014.

Infrastructure

  • Funding for the Arecibo Observatory will increase to $3.50 million (+$300,000).
  • Support for the National Center for Atmospheric Research (NCAR) is increased by $400,000, to a total of $99.0 million. This level of support maintains key community research infrastructure operated by NCAR.
  • Support for Research Resources is reduced by $7.86 million, to a total of $22.76 million. This level of support represents an increased focus on research using existing instrumentation rather than instrumentation development.
DIVISION OF EARTH SCIENCES (EAR)

$191,200,000
+$7,700,000 / 4.2%

Table EAR Funding (Dollars in Millions)
Category | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Total, EAR | $183.43 | $183.50 | $191.20 | $7.70 | 4.2%
Research | 119.45 | 117.09 | 125.75 | 8.66 | 7.4%
CAREER | 6.18 | 4.80 | 5.07 | 0.27 | 5.6%
Education | 4.42 | 4.93 | 5.48 | 0.55 | 11.2%
Infrastructure | 59.57 | 61.48 | 59.97 | -1.51 | -2.5%
Geodesy for the Advancement of Geoscience & EarthScope (GAGE) | 11.92 | 13.18 | 12.70 | -0.48 | -3.6%
Seismological Facilities for the Advancement of Geosciences & EarthScope (SAGE) | 26.12 | 26.76 | 25.70 | -1.06 | -4.0%
Research Resources | 21.54 | 21.54 | 21.57 | 0.03 | 0.1%

Totals may not add due to rounding.

EAR supports fundamental research into the structure, composition, and evolution of the Earth, and the life it has sustained over the four and a half billion years of Earth history. The results of this research will lead to a better understanding of Earth's changing environment (past, present, and future) and the natural distribution of its mineral, water, biota, and energy resources, and will provide methods for predicting and mitigating the effects of geologic hazards, such as earthquakes, volcanic eruptions, floods, and landslides.

Through its Surface Earth Processes Section, EAR supports research in geomorphology and land use, hydrologic science, geobiology and low-temperature geochemistry, and sedimentary geology and paleobiology. The division’s Deep Earth Processes Section maintains programs in geophysics, tectonics, petrology and geochemistry, and integrated earth systems. In addition to these fundamental research programs, EAR has an Instrumentation and Facilities program that supports community-based, shared-use facilities and the acquisition and development of instrumentation by individual investigators; EarthScope, a $200.0 million facility and science program focused on studying the structure and tectonics of the North American continent; and an education program that funds a number of activities to attract and support students and young investigators to the field of Earth Science.

Approximately 68 percent of EAR’s budget is used to support individuals and small groups of researchers, while about 32 percent of the budget goes to instrumentation and facilities. In general, 36 percent of EAR’s portfolio is available for new research grants. The remaining 64 percent funds continuing grants made in previous years.

FY 2014 Summary

All funding decreases/increases represent change over the FY 2012 Enacted level.

Research

  • EAR will continue its participation in SEES with $8.90 million for Water Sustainability and Climate, an increase of $1.90 million. In FY 2014, EAR will also participate in SEES SusCHEM - Sustainable Chemistry, Engineering, and Materials at $1.50 million, Hazards SEES at $4.50 million, and Coastal SEES at $3.50 million.
  • In FY 2014 EAR’s support for NSF’s INSPIRE Track 1 investment will be $1.0 million.
  • CAREER funding will be increased by $270,000 to a total of $5.07 million, reflecting EAR’s continued commitment to supporting early career investigators.

Education

  • EAR funding for the Research Experiences for Undergraduates (REU) Sites and Supplements program increases $250,000 over the FY 2012 Enacted level to $1.34 million. $180,000 of the increased FY 2014 funding will support enhanced research experiences for students in their first two years of college, as recommended by the President’s Council of Advisors on Science and Technology (PCAST) in their report, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics.
  • Support for EAR Postdoctoral Fellowships will increase by $300,000 to $1.70 million reflecting EAR’s commitment to workforce development.

Infrastructure

  • Beginning in FY 2014, the separate seismic and geodetic facilities operated by the Incorporated Research Institutions for Seismology (IRIS), UNAVCO and EarthScope will be integrated into two facilities: (1) Seismological Facilities for the Advancement of Geosciences and EarthScope (SAGE), operated by IRIS, will be funded at $25.70 million; and (2) Geodetic Facilities for the Advancement of Geoscience and EarthScope (GAGE), operated by UNAVCO, will be funded at $12.70 million.
DIVISION OF INTEGRATIVE AND COLLABORATIVE EDUCATION AND RESEARCH (ICER)

$93,710,000

+$2,500,000 / 2.7%

Table ICER Funding (Dollars in Millions)
Category | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Total, ICER | $91.30 | $91.21 | $93.71 | $2.50 | 2.7%
Research | 60.91 | 72.04 | 77.35 | 5.31 | 7.4%
Centers Funding (total) | 0.25 | 0.25 | - | -0.25 | -100.0%
Nanoscale Science & Engineering Centers | 0.25 | 0.25 | - | -0.25 | -100.0%
Education | 29.79 | 18.57 | 16.06 | -2.51 | -13.5%
Infrastructure | 0.60 | 0.60 | 0.30 | -0.30 | -50.0%
National Nanotechnology Infrastructure Network (NNIN) | 0.60 | 0.60 | 0.30 | -0.30 | -50.0%

Totals may not add due to rounding.

 ICER supports novel, complex, or partnership projects in both research and education. These investments cut across traditional boundaries within the geosciences, encouraging interdisciplinary activities and responding directly to critical needs of the entire geoscience community. ICER’s principal goals are to develop innovative means to initiate and support geoscience education, attract underrepresented groups to careers in the geosciences, foster the interchange of scientific information nationally and internationally, and to join with other parts of NSF in major integrative research and education efforts. In FY 2014, the division will make strategic investments in climate research, international activities, education, diversity, and human resource development.

In general, 38 percent of the ICER portfolio is available for new research grants. The remaining 62 percent funds continuing grants made in previous years.

FY 2014 Summary

All funding decreases/increases represent change over the FY 2012 Enacted level.

Research

  • ICER will support SEES activities totaling $17.25 million in FY 2014. Supported activities will lay the foundation for technologies to mitigate, and adapt to, environmental change that threatens sustainability, with an emphasis in FY 2014 on vulnerable regions in the Arctic and along coasts. ICER will participate in activities to establish a robust suite of Sustainability Research Networks, and in a program to identify clean energy sources and the impact of using those sources on the environment and society.
  • ICER supports a varied portfolio of international collaborative activities. In FY 2014, this will total $6.50 million, and emphasize collaborative research across the Americas and specific research activities sponsored by the International Group of Funding Agencies for Global Change Research.

Education

  • In FY 2014, GEO is consolidating geoscience education and diversity support into the NSF-wide CAUSE initiative. ICER houses GEO’s support for the CAUSE activity, which totals $10.90 million in FY 2014. No other education efforts are supported by ICER in FY 2014.

Infrastructure

  • ICER provides GEO’s contribution to the National Nanotechnology Infrastructure Network. In FY 2014, this support decreases by $300,000, to a total of $300,000, as a previously supported center refocuses its emphasis upon renewal.
DIVISION OF OCEAN SCIENCES (OCE)

$377,440,000
+$25,540,000 / 7.3%

Table OCE Funding (Dollars in Millions)
Category | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Total, OCE | $351.79 | $351.90 | $377.44 | $25.54 | 7.3%
Research | 162.47 | 183.21 | 175.69 | -7.52 | -4.1%
CAREER | 1.64 | 2.80 | 3.21 | 0.41 | 14.6%
Centers Funding (total) | 8.96 | 9.00 | 9.00 | - | -
STC: Coastal Margin Observation & Prediction | 4.00 | 4.00 | 4.00 | - | -
STC: Dark Energy Biosphere Investigations | 4.96 | 5.00 | 5.00 | - | -
Education | 9.00 | 7.94 | 6.12 | -1.82 | -22.9%
Infrastructure | 180.32 | 160.75 | 195.63 | 34.88 | 21.7%
Academic Research Fleet | 92.96 | 76.75 | 85.00 | 8.25 | 10.7%
International Ocean Discovery Program (IODP) | 51.68 | 44.40 | 50.00 | 5.60 | 12.6%
Ocean Observatories Initiative (OOI) | 26.80 | 26.80 | 52.80 | 26.00 | 97.0%
Research Resources | 8.87 | 10.80 | 6.83 | -3.97 | -36.8%
Facilities Pre-Construction Planning (total) | - | 2.00 | 1.00 | -1.00 | -50.0%
Regional Class Research Vessels (RCRV) | - | 2.00 | 1.00 | -1.00 | -50.0%

 Totals may not add due to rounding

Research, education, and infrastructure funded by OCE address the central role of the oceans in a changing Earth and as a national strategic resource, as recognized in the President’s 2010 Executive Order establishing a National Ocean Policy (NOP) and creating a National Ocean Council (NOC) to implement the policy. OCE supports interdisciplinary research to better understand changing ocean circulation and other physical parameters, biodiversity and the dynamics of marine organisms and ecosystems, and changing ocean chemistry as exemplified by ocean acidification. OCE also supports research on the geology of the ocean margins and sub-seafloor to investigate past ocean and climate conditions, stability of methane hydrates, natural hazards associated with earthquakes and volcanic eruptions, and microbial life deep below the seafloor. Ocean education emphasizes undergraduate REU programs and the interdisciplinary nature of ocean sciences. Since ocean science requires access to the sea, OCE supports research vessels, deep submergence capability including submersibles and autonomous vehicles, and technologically advanced sensors and instrumentation. In FY 2014, research emphases in OCE will be guided by the recently issued report “Science for an Ocean Nation: An Update of the Ocean Research Priorities Plan”. 8 This report identifies national research priorities in key areas of interaction between society and the ocean. These priorities include improved understanding of marine ecosystems, marine biodiversity, the impact of increased atmospheric carbon dioxide on ocean acidification, the ocean’s role in climate change, ocean observing, changing conditions in the Arctic, hazards and extreme events, and the enhancement of infrastructure to support ocean and coastal research.
In general, 30 percent of the OCE portfolio is available for new research grants. The remaining 70 percent funds continuing grants made in previous years. Approximately 52 percent of the overall budget supports facilities and infrastructure.

FY 2014 Summary

Research

  • OCE’s research funding decreases by $7.52 million from the FY 2012 Enacted level, primarily due to increased infrastructure funding. Included here are a $2.0 million investment in INSPIRE and a $1.60 million increase in CAREER. OCE will continue support for two STCs and the Long Term Ecological Research program at previous levels. OCE will give high priority to research themes that emerge from both the NOP and the Ocean Research Priorities Plan.
  • OCE will invest $19.50 million in SEES activities. This includes $8.50 million in Coastal SEES, $2.50 million in Hazards SEES, $6.0 million in Ocean Acidification, and $2.50 million in Dimensions of Biodiversity.
  • OCE will also continue its partnership on the theme of Oceans and Human Health with the National Institute of Environmental Health Sciences (NIEHS).

Education

  • Funding for educational activities will decrease by $1.82 million, reflecting the close-out of the Centers for Ocean Science Education Excellence (COSEE) program during FY 2014.
  • OCE support for REU will increase by $1.12 million over the FY 2012 Enacted level. $220,000 of the increased FY 2014 funding will support enhanced research experiences for students in their first two years of college, as recommended by the President’s Council of Advisors on Science and Technology (PCAST) in their report, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics.
  • Funding will continue for a program initiated in FY 2012, the OCE Postdoctoral Fellowship Program. This program aims to broaden participation in the ocean sciences through fellowships to enhance opportunities for women and minority scientists.

Infrastructure

  • Continuation of investment ($1.0 million) in planning and design for fleet renewal with construction of up to three Regional Class Research Vessels (RCRVs), as a candidate Major Research Equipment and Facilities Construction (MREFC) project.
  • A $26.0 million increase for continued implementation of the Ocean Observatories Initiative (OOI), bringing the total for operations and maintenance to $52.80 million in FY 2014. These increased funds support the transition from the design/build phase to deployment and testing of the network leading to full commissioning and operation in FY 2015.
  • Continued support ($50.0 million) is requested for FY 2014 operations of the drilling vessel JOIDES Resolution as part of the U.S. contribution to the International Ocean Discovery Program (IODP). This level is $5.60 million above the FY 2012 Enacted level. In FY 2012, the National Science Board approved a one-year extension of the current program into FY 2014 to allow time for a competition, currently underway, to award a new cooperative agreement to continue operations of the JOIDES Resolution. In FY 2013, NSF investments in IODP have been leveraged by support from international partners and by sub-leasing of the vessel to industry by the current contractor. Such leveraging is expected to continue in FY 2014 and beyond.
  • The cost of Academic Research Fleet operations will increase, reflecting the general upward trend in fuel costs and other expenses.
DIVISION OF POLAR PROGRAMS (PLR)

$464,900,000
+$29,030,000 / 6.7%

Table PLR Funding (Dollars in Millions)
Category | FY 2012 Actual | FY 2012 Enacted/Annualized FY 2013 CR | FY 2014 Request | Change Over FY 2012 Enacted Amount | Change Over FY 2012 Enacted Percent
Total, PLR | $436.20 | $435.87 | $464.90 | $29.03 | 6.7%
Research | 129.36 | 127.71 | 139.34 | 11.63 | 9.1%
CAREER | 1.34 | - | - | - | N/A
Centers Funding (total) | 4.45 | 3.77 | 3.11 | -0.66 | -17.5%
STC: Center for Remote Sensing of Ice Sheets | 4.45 | 3.77 | 3.11 | -0.66 | -17.5%
Education | 2.45 | 2.55 | 0.80 | -1.75 | -68.6%
Infrastructure | 304.39 | 305.61 | 324.76 | 19.15 | 6.3%
Arctic Research Support & Logistics | 42.08 | 43.54 | 44.00 | 0.46 | 1.1%
IceCube Neutrino Observatory (IceCube) | 3.45 | 3.45 | 3.45 | - | -
U.S. Antarctic Facilities & Logistics | 185.02 | 184.73 | 202.69 | 17.96 | 9.7%
U.S. Antarctic Logistical Support | 67.52 | 67.52 | 67.52 | - | -
Polar Environment, Safety, and Health | 6.31 | 6.37 | 7.10 | 0.73 | 11.5%


Totals may not add due to rounding.

The Division of Polar Programs (PLR) is the primary U.S. supporter of, and serves NSF interagency leadership responsibilities for, fundamental research in the polar regions. The Arctic Sciences section supports research in the social sciences, earth systems, and a broad range of natural sciences; its Research Support & Logistics program is driven by and responds to research needs, assisting researchers with access to the Arctic and with engagement in planning and sharing of results with local Arctic communities. Antarctic Sciences funds research for which access to Antarctica is essential to advancing the scientific frontiers, including research in a broad array of geo- and bio-sciences, such as earth system science, as well as space and astrophysical sciences that can only be achieved, or are best achieved, with work performed in Antarctica and the Southern Ocean. Antarctic Infrastructure and Logistics enables research in Antarctica on behalf of the U.S. Government through a network of stations, labs, equipment, and logistical resources. The Environment, Safety, and Health section provides oversight for the environmental, safety, and health aspects of research and operations conducted in polar regions.

PLR’s FY 2014 Request reflects three key priorities: (1) maintaining strong disciplinary programs that provide a base for investments in cross-disciplinary science programs; (2) focusing basic research on cross-foundation (e.g., Science, Engineering, and Education for Sustainability (SEES), Cyberinfrastructure Framework for 21st Century Science, Engineering, and Education (CIF21), GEOTRACES), and interagency priorities; and (3) supporting and improving the efficiency of critical facilities that enable research in both polar regions. For Antarctica, the primary objective is to initiate a concerted multi-year commitment toward more effective and lower-cost science support as recommended by the U.S. Antarctic Program (USAP) Blue Ribbon Panel (BRP) report, More and Better Science in Antarctica through Increased Logistical Effectiveness. 9 NSF issued a formal response to this report in March 2013. 10 This entails coordinated commitments by the Antarctic Sciences and Antarctic Infrastructure and Logistics sections, as well as the Environment, Safety, and Health section. Emphases include safety and health improvements, investments with positive net present value, and facilities renewal at McMurdo and Palmer stations. Additionally, the Antarctic science community will be asked to plan and execute more effective observational approaches as outlined in the 2011 National Research Council (NRC) report, Future Science Opportunities in Antarctica and the Southern Ocean, 11 and reaffirmed by the BRP. For the Arctic, shared cross-directorate basic research objectives, the recently released Interagency Arctic Research Policy Committee’s (IARPC) Arctic Research Plan FY 2013-2017 (Five-Year Plan) 12 and the National Ocean Policy (NOP) Implementation Strategy 13 inform science investment priorities.
In general, 40 percent of the PLR research portfolio is available for new grants and 60 percent for continuing grants. Approximately 70 percent of the overall budget supports facilities and logistics.

FY 2014 Summary

All funding decreases/increases represent change over the FY 2012 Enacted level.

Research

  • Funding increases to develop mechanisms to coordinate with Alaska Native organizations to build partnerships and broaden participation in scientific research, and to strengthen PLR’s long-standing commitment to engaging Alaska Native students and communities. (+$200,000, to a total of $300,000)
  • PLR will collaborate with OCE to support the U.S. components of several marine expeditions under the GEOTRACES program to measure trace elements and isotopes across the Arctic Ocean and the connecting passages between the Arctic, Atlantic, and Pacific Oceans. (+$2.0 million, to a total of $2.0 million)
  • Core Arctic research will be prioritized to support reinvestment in sustained synthesis of modeling and observational data at the system level and for research on the Chukchi and Beaufort Seas, also taking an integrative system-level approach. These will complement the more focused research supported by mission agencies, such as the Bureau of Ocean Energy Management (BOEM) and the National Oceanic and Atmospheric Administration (NOAA), and contribute to the coordinated activity under IARPC. ($6.0 million)
  • Core Antarctic research investments will include prioritizing investments in remotely deployed observation instruments and improved climate modeling techniques and capabilities, as recommended in the NRC and BRP reports. ($4.0 million)
  • Funding for the Center for the Remote Sensing of Ice Sheets (CReSIS) (-$660,000, to a total $3.11 million) reflects the planned sunsetting of the center in FY 2015.

Education

  • Funding for the Research Experiences for Undergraduates (REU) Sites and Supplements program increases $300,000 over the FY 2012 Enacted level. The increased funding will support enhanced research experiences for students in their first two years of college, as recommended by the President’s Council of Advisors on Science and Technology (PCAST) in their report, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics.

Infrastructure

  • Arctic Research Support and Logistics: The Arctic Research Support and Logistics program provides support for Arctic researchers, including airplanes, helicopters, access to icebreakers and field camps for researchers participating in approximately 150 projects in remote sites in Alaska, Canada, Arctic Scandinavia, Russia, and the Arctic Ocean. Summit Station on the Greenland icecap operates as a year-round international site for a variety of atmospheric and geophysical measurements. Increased funding will support anticipated ship-time on the new UNOLS vessel SIKULIAQ. (+$460,000, to a total of $460,000)
  • IceCube Neutrino Observatory: PLR continues to match the Directorate for Mathematical and Physical Science’s contribution of $3.45 million for operations and maintenance of the IceCube Observatory.
  • U.S. Antarctic Facilities and Logistics:
  • Funding provides all necessary infrastructure, instrumentation, and logistics for scientists from all disciplines performing research in Antarctica. This support includes forward staging facilities in New Zealand and South America; operation of three year-round stations in Antarctica; DoD fixed-wing aircraft; contracted rotary- and fixed-wing aircraft; two leased research vessels; and a leased icebreaker.
  • The NSF FY 2014 Budget Request for Antarctic Facilities and Logistics also focuses on implementing recommendations from the USAP BRP. NSF issued a formal response to the BRP in March 2013,14 and this Budget Request incorporates investments outlined in that document, totaling $18.0 million (the itemized amounts below sum to this total; see the quick check sketched after this list). These include:
    • Address resupply issues at Palmer Station by improving the condition of the pier and mitigating the underwater rock ledge that currently prevents larger vessels from docking at the station ($8.0 million). The small boating range will also be increased through the purchase of rigid hull inflatable boats ($1.0 million), which will allow for better access to areas of scientific interest and provide a more robust platform for the recovery of remote sensing systems, such as gliders and robotic underwater vehicles;
    • Improve fire suppression engineering through a study of fire suppression requirements and existing capabilities in order to target any areas needing improvement ($1.0 million);
    • Roboticize the overland traverse to increase productivity by increasing the number of trips each swing of the traverse can make in a single season to the South Pole or other interior scientific locations ($2.0 million). New tractors will be purchased for the heavy traverse ($4.0 million) to provide a standardized tractor fleet on which to build the robotics system; and
    • Consolidate warehousing activities to enable reductions in physical and personnel footprint. The first phase of the new McMurdo long-range plan involves construction of a single logistics and warehousing facility, as well as a consolidated trade shop and warehousing complex. Design work will begin on these facilities so that materials and equipment can be purchased for delivery to Antarctica on the 2015 resupply vessel. ($2.00 million)
  • Polar Environment, Safety, and Health: The Environment, Safety, and Health section provides oversight for these aspects of conducting and supporting research in the polar regions. Funds are invested in development of an electronic medical records system that will serve both polar regions. (+$180,000 to a total of $180,000)
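
As a quick arithmetic check, the itemized USAP Blue Ribbon Panel investments in the sub-list above sum to the $18.0 million cited. A minimal sketch in Python follows; the item labels are shortened paraphrases of the bullets, not official NSF budget line items.

```python
# Itemized BRP-related investments from the bullets above (dollars in millions).
# Labels are shortened paraphrases of the bullet text, not official line items.
brp_items = {
    "Palmer Station pier and rock ledge work": 8.0,
    "Rigid hull inflatable boats": 1.0,
    "Fire suppression study": 1.0,
    "Overland traverse robotics": 2.0,
    "New traverse tractors": 4.0,
    "McMurdo logistics and warehousing design": 2.0,
}

total = sum(brp_items.values())
print(f"Total BRP investments: ${total:.1f} million")  # expected: $18.0 million
```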

Mathematical and Physical Sciences (MPS)

Program Title Mathematical and Physical Sciences (MPS)
Program Description Research in this activity is directed at increasing understanding of natural laws and phenomena across the astronomical sciences, chemistry, materials sciences, mathematical sciences, and physics. The Divisions within the Mathematical and Physical Sciences Directorate are: • Astronomical Sciences (MPS/AST) • Chemistry (MPS/CHE) • Materials Research (MPS/DMR) • Mathematical Sciences (MPS/DMS) • Physics (MPS/PHY) • Office of Multidisciplinary Activities (MPS/OMA) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/21_fy2014.pdf. (PDF)
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives G1/01, G1/02, G1/03; G2/01, G2/02
Supported Agency Priority Goals Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

Social, Behavioral and Economic Sciences (SBE)

Program Title Social, Behavioral, and Economic Sciences (SBE)
Program Description This activity supports research, education, and infrastructure in the social, behavioral, cognitive, and economic sciences and funds the collection and dissemination of statistics on the science and engineering enterprise. The Divisions within the Social, Behavioral, and Economic Sciences Directorate are: • Social and Economic Sciences (SBE/SES) • Behavioral and Cognitive Sciences (SBE/BCS) • Office of Multidisciplinary Activities (SBE/SMA) • National Center for Science and Engineering Statistics (SBE/NCSES) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/22_fy2014.pdf. (PDF)
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives G1/01, G1/02, G1/03; G2/01, G2/02
Supported Agency Priority Goals Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

International and Integrative Activities (IIA)

Program Title International and Integrative Activities (IIA)
Program Description This activity supports emerging cross-disciplinary research efforts and major research instrumentation; promotes an integrated strategy for international science and engineering that complements and enhances NSF's broader research and education goals; and facilitates international collaboration. This activity also provides support for the Science and Technology Policy Institute. The Experimental Program to Stimulate Competitive Research broadens participation of States and regions in science and engineering by helping institutions expand their research capacity and competitiveness. The subactivities housed within the Office of International and Integrative Activities are: • Communicating Science Broadly (CSB) • The Experimental Program to Stimulate Competitive Research (EPSCoR) • Integrated NSF Support Promoting Interdisciplinary Research and Education (INSPIRE) • International Science and Engineering (ISE) • Major Research Instrumentation (MRI) • NSF Graduate Research Fellowships (NGRF) • NSF Research Traineeships (NRT) • Science and Technology Centers (STC) • Science and Technology Policy Institute (STPI) • Science and Technology for America’s Reinvestment (STAR METRICS) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/23_fy2014.pdf. (PDF)
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives G1/01, G1/02, G1/03; G2/01, G2/02
Supported Agency Priority Goals Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

United States Arctic Research Commission (USARC)

Program Title United States Arctic Research Commission (USARC)
Program Description The United States Arctic Research Commission promotes Arctic research and recommends national Arctic research policies to guide Federal agencies in developing and implementing their research programs in the Arctic region.
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering 
Supported Strategic Objectives G1/03
Supported Agency Priority Goals Not applicable.

Note that USARC is an independent agency that is included in NSF’s program inventory but not covered by the NSF strategic plan.

Education and Human Resources (EHR)

Program Title Education and Human Resources (EHR)
Program Description This activity supports a comprehensive set of programs in all areas of science, technology, engineering, and mathematics (STEM), at all levels, inside and outside of school, to build a diverse, globally competent STEM workforce and a STEM-literate citizenry. EHR invests in research and development on STEM education and learning, and in scholarships and fellowships to build the STEM workforce. The Divisions within the Education and Human Resources Directorate are: • Research on Learning in Formal and Informal Settings (EHR/DRL) • Graduate Education (EHR/DGE) • Human Resource Development (EHR/HRD) • Undergraduate Education (EHR/DUE) Current information about the program can be found at http://nsf.gov/about/budget/fy2014/pdf/25_fy2014.pdf. (PDF)
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives G1/01, G1/02, G1/03; G2/01, G2/02
Supported Agency Priority Goals Increase Data Scientists and Data Infrastructure; Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

Major Research Equipment and Facilities Construction (MREFC)

Program Title Major Research Equipment and Facilities Construction (MREFC)
Program Description The Major Research Equipment and Facilities Construction activity supports the acquisition, construction, and commissioning of unique national research platforms and major research facilities and equipment. Performance of each construction project is measured against an established baseline at regular intervals and at major milestones.
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering
Supported Strategic Objectives G1/03
Supported Agency Priority Goals Optimize the Award Process to Level Workload

 

Agency Operations and Award Management (AOAM)

Program Title Agency Operations and Award Management (AOAM)
Program Description This account funds NSF's scientific, professional, and administrative workforce, the physical and technological infrastructure necessary for a productive, safe and secure work environment, and the essential business operations critical to NSF's administrative processes.
Supported Strategic Goals Strategic Goal 3: Excel as a Federal Science Agency
Supported Strategic Objectives G3/01, G3/02
Supported Agency Priority Goals Optimize the Award Process to Level Workload; Ensure Public Access to Publications

 

Office of the National Science Board (NSB)

Program Title Office of the National Science Board (NSB)
Program Description This appropriation provides policy-making and related responsibilities for NSF and provides guidance on significant national policy issues in science and engineering research and education, as required by law.
Supported Strategic Goals Strategic Goal 1: Transform the Frontiers of Science and Engineering Strategic Goal 2: Stimulate Innovation and Address Societal Needs through Research and Education
Supported Strategic Objectives G1/01, G1/02, G1/03; G2/01, G2/02
Supported Agency Priority Goals None

 

Office of Inspector General (OIG)

Program Title Office of Inspector General (OIG)
Program Description This appropriation provides agency-wide audit and investigative functions to identify and correct management and administrative deficiencies which create conditions for existing or potential instances of fraud, waste, and mismanagement consistent with the Inspector General Act of 1978, as amended (5 U.S.C. App. 3).
Supported Strategic Goals Strategic Goal 3: Excel as a Federal Science Agency
Supported Strategic Objectives None
Supported Agency Priority Goals None
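
Read together, the program entries above define a compact mapping from NSF programs to the strategic objectives and agency priority goals they support, the kind of table that can be loaded into a spreadsheet or data browser for filtering and analysis. Below is a minimal sketch of that transcription in Python; the CSV column names and file name are illustrative assumptions, not an NSF schema.

```python
import csv

# Illustrative transcription of the program inventory above into flat records;
# the column names and file name are assumptions, not an official NSF schema.
PRIORITY = ("Increase Data Scientists and Data Infrastructure; "
            "Optimize the Award Process to Level Workload; "
            "Ensure Public Access to Publications")

programs = [
    ("MPS",   "G1/01, G1/02, G1/03; G2/01, G2/02", PRIORITY),
    ("SBE",   "G1/01, G1/02, G1/03; G2/01, G2/02", PRIORITY),
    ("IIA",   "G1/01, G1/02, G1/03; G2/01, G2/02", PRIORITY),
    ("USARC", "G1/03", "Not applicable"),
    ("EHR",   "G1/01, G1/02, G1/03; G2/01, G2/02", PRIORITY),
    ("MREFC", "G1/03", "Optimize the Award Process to Level Workload"),
    ("AOAM",  "G3/01, G3/02",
     "Optimize the Award Process to Level Workload; Ensure Public Access to Publications"),
    ("NSB",   "G1/01, G1/02, G1/03; G2/01, G2/02", "None"),
    ("OIG",   "None", "None"),
]

with open("nsf_program_inventory.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["program", "supported_strategic_objectives", "supported_priority_goals"])
    writer.writerows(programs)
```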

 

FY 2013 NSF Budget Request to Congress Performance

Source: http://www.nsf.gov/about/budget/fy20...nce_fy2013.pdf (PDF)

NSF Performance Context

The following information on NSF’s mission statement, organizational structure, and strategic goals is required to be included in the agency’s Annual Performance Plan and Annual Performance Report by the GPRA Modernization Act of 2010 (P.L.111-352).

Mission Statement

The NSF Act of 1950 (Public Law 81-507) states the Foundation’s mission: “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes.”

Strategic Plan and Performance Goals

NSF’s Strategic Plan, Empowering the Nation Through Discovery and Innovation: NSF Strategic Plan for Fiscal Years (FY) 2011-2016 1, lays out three strategic goals—Transform the Frontiers, Innovate for Society, and Perform as a Model Organization—that relate directly to this mission. This goal structure enables NSF to link its investments to longer-term outcomes. To bridge the gap between these strategic goals and measurable outputs, the Strategic Plan establishes a set of performance goals (also called strategic objectives) for each strategic goal:

1 http://www.nsf.gov/about/performance

Table 1
Strategic Goal Strategic Objectives / Performance Goals
Transform the Frontiers (T) emphasizes the seamless integration of research and education as well as the close coupling of research infrastructure and discovery. T-1: Make investments that lead to emerging new fields of science and engineering and shifts in existing fields. T-2: Prepare and engage a diverse science, technology, engineering, and mathematics (STEM) workforce motivated to participate at the frontiers. T-3: Keep the United States globally competitive at the frontiers of knowledge by increasing international partnerships and collaborations. T-4: Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and to enable transformation at the frontiers.
Innovate for Society (I) points to the tight linkage between NSF programs and societal needs, and it highlights the role that new knowledge and creativity play in economic prosperity and society’s general welfare. I-1: Make investments that lead to results and resources that are useful to society. I-2: Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering. I-3: Support the development of innovative learning systems.
Perform as a Model Organization (M) emphasizes the importance to NSF of attaining excellence and inclusion in all operational aspects. M-1: Achieve management excellence through leadership, accountability, and personal responsibility. M-2: Infuse learning as an essential element of the NSF culture with emphasis on professional development and personal growth. M-3: Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service. 

       

Under each of these performance goals are one or more strategic targets, 14 in all. In FY 2011, each strategic target was monitored through one or more annual performance goals, 16 in all. The NSF set 18 performance goals for FY 2012-FY 2013.

GPRA Modernization Act of 2010

The GPRA Modernization Act of 2010 (P.L.111-352, 31 U.S.C. 1115(b)(10)) revises the federal government’s performance management framework, retaining and amplifying certain aspects of the Government Performance and Results Act of 1993 (GPRA). The Act shifts the focus of its predecessor from the production of plans and reports to the use of goals and performance data to improve outcomes. Among other changes, it strengthens leadership engagement in setting ambitious goals, reviewing progress, and clearly communicating results.

The GPRA Modernization Act (GPRA-MA) institutes important changes to existing strategic planning, performance planning, and reporting requirements. GPRA-MA serves as a foundation for helping agencies to focus on their highest priorities and to create a culture where data and empirical evidence play a greater role in policy, budget, and management decisions.

As required by the GPRA Modernization Act, this FY 2013 Congressional Budget Request includes the following reports on NSF’s performance framework:

  • FY 2012 - 2013 Agency Priority Goals. This section also includes a report on the FY 2010-2011 Priority Goal
  • FY 2012 - 2013 Annual Performance Plan, Goals and Priorities section
  • FY 2011 Annual Performance Report, including information about the program evaluations completed in FY 2011
  • FY 2011-2016 Strategic Plan Addendum
  • Supporting Information, including discussion of the strategies and supporting analyses used in development of the FY 2012-2013 Plan and the methods used to verify and validate performance data.

Agency Priority Goals

FY 2012-FY 2013 Agency Priority Goals

NSF has set three priority goals for accomplishment in FY 2012 and FY 2013. These goals cover the range of programmatic activities that NSF supports, from basic research to training of the science and engineering workforce to education of the general public. The goals will require cross-agency coordination to make progress, and NSF is leveraging its experiences with FY 2010-FY 2011 Priority Goal achievement towards this end.

The information on the following pages about the FY 2012-2013 Goals is also available on NSF’s Performance.gov page. NSF also participates in cross-agency Federal Priority Goals. Please refer to Performance.gov for information on these Federal Priority Goals and NSF’s contributions to them.

FY 2012-FY 2013 NSF Agency Priority Goals
Table 2
Goal Short Title / Impact Statement / Goal Statement
Access to Digital Products of NSF-Funded Research. Impact Statement: Increase opportunities for research and education through public access to high-value digital products of NSF-funded research. Goal Statement: By September 30, 2013, NSF will have established policies for public access to high-value data and software in at least two data-intensive scientific domains.
Undergraduate Programs. Impact Statement: Develop a diverse and highly qualified science and technology workforce. Goal Statement: By September 30, 2013, 80 percent of institutions funded through NSF undergraduate programs document the extent of use of proven instructional practices.
NSF Innovation Corps. Impact Statement: Increase the number of entrepreneurs emerging from university laboratories. Goal Statement: By September 30, 2013, 80 percent of teams participating in the NSF Innovation Corps program will have tested the commercial viability of their product or service.

FY 2012-FY 2013 Priority Goal: Access to Digital Products of NSF-Funded Research

Impact Statement

Increase opportunities for research and education through public access to high-value digital products of NSF-funded research.

Goal Statement

By September 30, 2013, NSF will have established policies for public access to high-value data and software in at least two data-intensive scientific domains.

Lead Organization

Directorate for Mathematical and Physical Sciences

Relevant Strategic Objective/Performance Goal

Under the “Transform the Frontiers” strategic goal, Objective T-4, “Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and enable transformation at the frontiers.”

Description

Digital data are increasingly one of the primary products of scientific research. As advanced by the National Science Board, open data sharing is closely linked to public access to scholarly publications resulting from federally funded unclassified research, and they should be considered in concert. The digital data underlying figures and the key findings in the literature should be accessible and linked to one another so that scientists can verify and reproduce major findings in the literature and repurpose the data to enable new discoveries. Simultaneously, access to digital products of research enhances openness and transparency in the scientific enterprise and enables new types of multi-disciplinary research and education. Therefore, it is increasingly important for NSF to facilitate and encourage access to data and research results. The priority goal supports this vision of increasingly collaborative and multi-disciplinary science by assuring that knowledge and data can flow easily across traditional disciplinary boundaries.

This goal is also linked to the National Science Foundation’s concept for “OneNSF”, which promotes collaboration in well-integrated and efficient ways across organizational and disciplinary boundaries.

Contributing Programs

The effort is led by the Assistant Director of the Mathematical and Physical Sciences Directorate. All programmatic directorates and offices will provide appropriate program staff for working groups. NSF will also help staff, and serve as liaison to, related activities of the National Science Board. Personnel from the NSF Policy Office (Office of Budget and Finance and Award Management, Division of Institutional and Award Support) and NSF Information Systems (Office of Information and Resource Management, Division of Information Systems) will be required for working groups and implementation.

Other research agencies, such as the National Institutes of Health (NIH) and the Department of Energy (DOE), face similar challenges with regard to products of federally funded scientific research. NSF staff communicate with representatives of other agencies to identify best practices for facilitating access to data. Prototyping and implementation will engage a variety of external stakeholders.

FY 2012-FY 2013 Priority Goal: Undergraduate Programs

Impact Statement

Develop a diverse and highly qualified science and technology workforce.

Goal Statement

By September 30, 2013, 80 percent of institutions funded through NSF undergraduate programs document the extent of use of proven instructional practices.

Lead Organization

Directorate for Education and Human Resources (EHR).

Relevant Strategic Objective/Performance Goal

Under “Transform the Frontiers” strategic goal, T-2: “Prepare and engage a diverse STEM workforce motivated to participate at the frontiers.”

Description

NSF’s Strategic Plan emphasizes the pressing need for science and engineering workforce development and emphasizes diversity and innovation as aspects of high quality preparation and engagement. The current priority goal addresses NSF’s long-term core commitment to using undergraduate education to engage and prepare a diverse and highly qualified science and engineering workforce. Research shows that evidence-based instructional practices lead to improved student learning, and thus are a useful metric for assessing impact on a well-prepared workforce. Therefore, one way that NSF can advance its efforts to invest in the preparation of a strong science and engineering workforce is by encouraging and facilitating the use of empirically-based instructional practices in undergraduate science, technology, engineering, and mathematics (STEM) education. To do this first means establishing a baseline about the use of such practices.

Implementation of this goal will include the design, piloting, and testing of a suite of methods to gather information about the nature of undergraduate STEM instructional practice in institutions. These methods will enable academic institutions to benchmark their instructional practices in STEM fields.

By September 30, 2013, 80 percent of the institutions funded through NSF undergraduate programs will be expected to document the extent of their use of evidence-based instructional practices.

Contributing Programs

All NSF programs that support undergraduate science, technology, engineering, and mathematics (STEM) education.

FY 2012-FY 2013 Priority Goal: NSF Innovation Corps

Impact Statement

Increase the number of entrepreneurs emerging from university laboratories.

Goal Statement

By September 30, 2013, 80 percent of teams participating in the NSF Innovation Corps program will have tested the commercial viability of their product or service.

Lead Organization

Directorate for Engineering

Relevant Strategic Objective/Performance Goal

Under “Innovate for Society” strategic goal, Objective I-1, “Make investments that lead to results and resources that are useful to society,” and Objective I-2, “Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.”

Description

Through the NSF Innovation Corps (I-Corps) program, NSF seeks to accelerate the development of new technologies, products and processes that arise from fundamental research. The goals of I-Corps are to spur translation of fundamental research, to encourage collaboration between academia and industry, and to train students to understand innovation and entrepreneurship. With I-Corps, NSF supports NSF-funded researchers whose efforts will be augmented - in the form of mentoring and funding - to accelerate the translation of knowledge derived from fundamental research into emerging products and services that can attract subsequent third party funding. NSF investments will strategically strengthen the innovation ecosystem (http://www.nsf.gov/eng/iip/innovation.pdf) by addressing the challenges inherent in the early stages of the innovation process.

Implementation of this goal will require the development of mechanisms and partnerships to support I-Corps teams. The teams will consist of three people and will include an NSF Principal Investigator (PI), an Entrepreneurial Lead (typically a graduate student) and a mentor from the private sector. Teams will propose a specific service or product concept for six months of I-Corps-funded development and testing. The concepts will have arisen from basic research led by the PI and funded by NSF within the previous five years. The selected teams will proceed through an intensive prescribed curriculum designed to develop their entrepreneurial skills and to ensure that the critical assumptions underpinning their perceived opportunities are tested. At the end of the six months, teams will decide whether or not to proceed with development of their concept.

Contributing Programs

NSF Innovation Corps, http://www.nsf.gov/news/special_reports/i-corps/

FY 2010-FY 2011 Priority Goal Report

NSF had one priority goal for accomplishment in FY 2010 and FY 2011. A summary of this priority goal, including the goal’s achievements, is provided below.

FY 2010-FY2011 Priority Goal: Science, Technology, Engineering and Mathematics (STEM) Workforce Development

Goal Statement

By the end of 2011, at least six major NSF science, technology, engineering and mathematics (STEM) workforce development programs at the graduate, postdoctoral, or early career level have evaluation and assessment systems providing findings enabling program re-design or consolidation for more strategic impact.

Lead Organization

Directorate for Education and Human Resources

Relevant Strategic Objective/Performance Goal

Transform the Frontiers strategic goal, subgoal T-2: “Prepare and engage a diverse STEM workforce motivated to participate at the frontiers.”

Result: Achieved. Twelve programs reached the target.

NSF’s goal aimed to strengthen NSF’s internal culture and practices to rely more heavily on assessment and evaluation for decision-making and grants program improvement. The NSF Priority Goal (PG) target was for six of the twenty-five participating grants programs to have evaluation and assessment systems capable of providing enough information for program re-design or consolidation. A theoretical model of program design and evaluation was developed to help provide a framework for assessment and measurement.

At the close of the performance period, September 30, 2011, materials from each of the twenty-five programs were rated by an expert against the theoretical model. Twelve of the twenty-five programs in the target program universe reached the goal, identified below by NSF managing directorate or office and program focus:

Table 3
Program Directorate Focus
Alliances for Graduate Education and the Professoriate (AGEP) EHR/SBE Graduate
Faculty Early Career Development Program (CAREER) EHR Early Career
Fellowships for Transformative Computational Science using Cyberinfrastructure (CI-TraCS) OCI Postdoctoral
Earth Sciences Postdoctoral Fellowship (EAR-PF) GEO Postdoctoral
Graduate Research Fellowship Program (GRF) EHR Graduate
Integrative Graduate Education and Research Traineeship Program (IGERT) EHR Graduate
International Research Fellowship Program (IRFP) OISE Postdoctoral
Mathematical Sciences Postdoctoral Research Fellowships (MSPRF) MPS Postdoctoral
Robert Noyce Teacher Scholarship Program (NOYCE) EHR Postdoctoral
Opportunities for Enhancing Diversity in the Geosciences (OEDG) GEO Graduate
Postdoctoral Research Fellowship in Biology (PRFB) BIO Postdoctoral
Scholarship for Service/Cybercorps (SFS) EHR Graduate

                            

All twenty-five programs made progress towards achieving the goal. Of the thirteen programs that did not achieve the goal, four have not been active long enough to meet all the criteria of the theoretical model.

A more comprehensive report is available for download at http://www.nsf.gov/about/budget/fy2013. Posting of this report addresses concerns about transparency and accountability, which were raised by an Office of the Inspector General audit in FY 2011 of NSF’s process for achieving Priority Goals. The audit found that “the detail and documentation provided to support milestone accomplishment was inadequate and did not provide for the transparency and accountability intended of priority goal processes.” However, it also stated:

NSF has taken steps consistent with OMB guidance related to coordinating, measuring, monitoring, and communicating progress towards achieving its priority goal. Through the priority goal process, NSF appears to be moving towards a foundation of increased performance data-driven program evaluation and assessment, as evidenced by improvements in both the quantity and quality of the performance information included in its budget requests to Congress. 1

1 http://www.nsf.gov/oig/11-2-008.pdf

Lessons Learned and Future Directions

NSF will not continue this Priority Goal in FY 2012-2013, but will continue efforts to bring a stronger orientation toward evidence as a basis for program improvement. Changing culture is a slow process, but, as a direct result of undertaking this Priority Goal, a culture of evaluation and performance assessment is gaining momentum across the Foundation. Agency-wide collaborations will continue.

Unanticipated positive impacts of the Priority Goal process include:

  • As a result of their involvement in the Priority Goal activities, staff leaders of the twelve postdoctoral and early career programs increased their level of collaboration. One major activity undertaken, for example, was the development of a common logic model for postdoctoral programs which could serve as an umbrella for individual programs. And, the programs collectively convened a group of postdoctoral grantees to discuss how to best improve postdoctoral programming. In the future, this effort could help the programs make maximal use of limited resources, improve assessment, and share best practices to improve effectiveness.
  • Partially in response to needs of program staff working on the Priority Goal activities, in FY 2011 NSF’s Office of Integrative Activities (OIA) began developing capabilities for NSF-wide data mining and analysis of available program information. The office collaborated with a number of Priority Goal programs to attempt to address specific questions formulated by the programs as part of the Priority Goal process. This new capacity for improved data mining of existing program documents provided substantial evidence for evaluative analysis of the programs participating in the PG activity. Given NSF’s commitment to developing Foundation-wide evaluation capability, the OIA participation in this component of the Priority Goal was critical in helping to build needed resources.
  • Ongoing discussion of metrics and monitoring systems for STEM education and workforce development, in programs across the agency, has benefitted from the Priority Goal activity, and there are new efforts for a range of programs to work with logic models and goal development.

FY 2012 – FY 2013 Annual Performance Plan

Introduction

This Annual Performance Plan, together with other sections of this chapter, addresses the topics specified in the GPRA Modernization Act. One exception, however, is the topic of Federal Priority Goals: per the GPRA Modernization Act (P.L. 111-352) requirement to address Federal Goals in the agency Strategic Plan and Annual Performance Plan, please refer to Performance.gov for information on Federal Priority Goals and NSF’s contributions to those goals, where applicable.

NSF’s FY 2012 and FY 2013 performance goals are presented in the table below in context with their associated strategic goal. As in FY 2011, all program activities within the agency are covered, and all longer-term objectives under each strategic goal are covered.

Table 4
Strategic Goal FY 2012-FY 2013 Performance Goals (new or continuing activity in parentheses)
Transform the Frontiers T-1.1 INSPIRE* (Continuing); T-2.1 Priority Goal, Undergraduate Programs (New); T-2.2 Career-Life Balance (New); T-3.1 International Implications (Continuing); T-4.1 Construction Project Monitoring (Continuing); T-4.2 Priority Goal, Access to Digital Products (New)
Innovate for Society I-1.1 Priority Goal, Innovation Corps (New); I-1.2 Industrial and Innovation Partnerships (Continuing); I-2.1 Public Understanding and Communication (Continuing); I-2.2 K-12 Scale-up (Continuing); I-3.1 Innovative Learning Systems (Continuing)
Perform as a Model Organization M-1.1 Model EEO* Agency (Continuing); M-1.2 IPA* Performance Plans (Continuing); M-1.3 Performance Management System (New); M-2.1 Assess Developmental Needs (Continuing); M-3.1 Financial System Modernization (Continuing); M-3.2 Time To Decision (Continuing); M-3.3 Virtual Panels (New)

*Acronyms:

INSPIRE: Integrated NSF Support Promoting Interdisciplinary Research and Education

EEO: Equal Employment Opportunity

IPA: Intergovernmental Personnel Act
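
Restated as flat records, Table 4 pairs each performance goal with its strategic goal and new-or-continuing status, and the totals can be checked against the 18 performance goals cited earlier for FY 2012-FY 2013. A minimal sketch in Python follows; the record layout is an illustrative assumption, with values transcribed from the table above.

```python
# Table 4 restated as (strategic goal, goal id, title, status) records.
# The layout is illustrative; values are transcribed from the table above.
table4 = [
    ("Transform the Frontiers", "T-1.1", "INSPIRE", "Continuing"),
    ("Transform the Frontiers", "T-2.1", "Priority Goal, Undergraduate Programs", "New"),
    ("Transform the Frontiers", "T-2.2", "Career-Life Balance", "New"),
    ("Transform the Frontiers", "T-3.1", "International Implications", "Continuing"),
    ("Transform the Frontiers", "T-4.1", "Construction Project Monitoring", "Continuing"),
    ("Transform the Frontiers", "T-4.2", "Priority Goal, Access to Digital Products", "New"),
    ("Innovate for Society", "I-1.1", "Priority Goal, Innovation Corps", "New"),
    ("Innovate for Society", "I-1.2", "Industrial and Innovation Partnerships", "Continuing"),
    ("Innovate for Society", "I-2.1", "Public Understanding and Communication", "Continuing"),
    ("Innovate for Society", "I-2.2", "K-12 Scale-up", "Continuing"),
    ("Innovate for Society", "I-3.1", "Innovative Learning Systems", "Continuing"),
    ("Perform as a Model Organization", "M-1.1", "Model EEO Agency", "Continuing"),
    ("Perform as a Model Organization", "M-1.2", "IPA Performance Plans", "Continuing"),
    ("Perform as a Model Organization", "M-1.3", "Performance Management System", "New"),
    ("Perform as a Model Organization", "M-2.1", "Assess Developmental Needs", "Continuing"),
    ("Perform as a Model Organization", "M-3.1", "Financial System Modernization", "Continuing"),
    ("Perform as a Model Organization", "M-3.2", "Time To Decision", "Continuing"),
    ("Perform as a Model Organization", "M-3.3", "Virtual Panels", "New"),
]

new_ids = [goal_id for _, goal_id, _, status in table4 if status == "New"]
print(f"{len(table4)} performance goals, {len(new_ids)} new: {', '.join(new_ids)}")
```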

This FY 2012-FY 2013 Plan builds on NSF’s FY 2011 Performance Plan, which was the first under a new set of Strategic Goals introduced by NSF’s FY 2011-FY 2016 Strategic Plan, Empowering the Nation Through Discovery and Innovation. Almost all FY 2011 goals continue into FYs 2012 and 2013, and several activities with no FY 2011 precedents have been added as performance goals, including goals to advance the OneNSF framework (INSPIRE, Expeditions in Education, Innovation Corps) and the Career-Life Balance Initiative. NSF’s three Priority Goals for FY 2012 and FY 2013 are also included in this Plan.

In FY 2012 and FY 2013, NSF expects to continue refining its implementation of the GPRA Modernization Act. This will involve internal process modifications to integrate lessons learned from past experience with ideas from the growing performance management community of practice in the Federal government.

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-1

Make investments that lead to emerging new fields of science and engineering and shifts in existing fields.

Strategic Target

The NSF portfolio fully incorporates emerging areas with transformative potential, including those forming at disciplinary boundaries.

Goal T-1.1 INSPIRE (Integrated NSF Support Promoting Interdisciplinary Research and Education)
Table 5
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Strengthen support of unusually novel, potentially transformative, interdisciplinary research (IDR), through new funding mechanisms, systems, and incentives that facilitate and encourage IDR.
Target Measure, Milestone, or Deliverable By September 30, 2012, • Track 1: Gather baseline data on NSF-supported IDR. • Track 2: Make 25 awards via the pilot CREATIV (Creative REsearch Awards for Transformative Interdisciplinary Ventures) mechanism. By September 30, 2013, • Track 1: Modify NSF’s eBusiness systems to facilitate co-review and management of proposals by multiple divisions, and to ease tracking of co-funded IDR. • Track 2: Award up to one-third of FY 2013 INSPIRE funds via the CREATIV mechanism. • Track 2: Establish a second pilot award mechanism for funding mid-scale IDR (up to $3 million), and make first round of awards.
Explanation (FY 2012 and FY 2013) INSPIRE was established to address some of the most complicated and pressing scientific problems that lie at the intersections of traditional disciplines. INSPIRE will strengthen NSF’s support of interdisciplinary, potentially transformative research by complementing existing efforts with a suite of new, highly innovative Foundation-wide activities and funding opportunities. The INSPIRE program has two goals. The first goal is to provide NSF program officers with the necessary tools and management support to empower cross-cutting collaboration and risk-taking in developing and managing their awards portfolio. The second goal is for researchers to submit, and NSF to support, a greater proportion of unusually novel, creative interdisciplinary proposals. For more information about INSPIRE’s background, goals, design, and investment and evaluation framework, refer to the NSF-Wide Investments tab.
Potential Methods and Processes (FY 2012 and FY 2013) Track 1 of INSPIRE seeks to make changes to NSF systems and practices that will facilitate identification, review, support, management, and tracking of IDR. Activities will encompass improvements in business practices, funding culture, training, and evaluation. The pilot CREATIV (Creative Research Awards for Transformative Interdisciplinary Ventures) award mechanism will invest all of the INSPIRE funds for FY 2012. CREATIV awards, with a maximum award size of $1.0 million, will generally be internally reviewed and will support bold high-risk IDR projects that investigators may be reluctant to submit to a conventional review process. The CREATIV funding mechanism will be open to ideas on any NSF-supported topic as long as they are interdisciplinary and potentially transformative. To facilitate later evaluation of INSPIRE, baseline data on NSF-supported IDR will be gathered, and internal and external surveys about support of IDR and potentially transformative research (PTR) will be conducted. In FY 2013, the 2012 portfolio of CREATIV awards will be analyzed to determine whether the new mechanism is resulting in types of awards that were not being funded with previous mechanisms. Case studies and qualitative assessments of the review process for projects with transformative results are expected to provide helpful information. In FY 2013, the INSPIRE awards activities will continue, supporting the second year of the CREATIV pilot and expanding to include larger “mid-scale” awards up to the range of $2.5-3.0 million. This second pilot INSPIRE mechanism will be open to IDR proposals on any NSF-supported topic and will utilize novel merit review mechanisms involving both internal and external review. Directorates and offices will co-fund CREATIV and mid-scale awards, together with centralized funds from the Office of Integrative Activities. This and future reports on this activity to support fundamental, high-risk, and potentially transformative research are also provided per Section 1008 of the 2007 America COMPETES Act.
Trend information (FY 2012 and FY 2013) INSPIRE is a new activity in FY 2012. Its centralized IA funds and interdisciplinary aspects have no clear precedent at NSF. Baseline data is yet to be gathered. NSF has been experimenting with how to support potentially transformative research (PTR) for several years. The EAGER (Early-concept Grants for Exploratory Research) mechanism, first used in FY 2009, is designed to support small-scale PTR (two years, <$300,000). In FY 2010, the directorates funded by the Research and Related Activities appropriations account allocated a total of $138.44 million to explore different methodologies to support PTR, which created a set of awards that are collectively considered case studies for the various methodologies used. The tracking of this activity was a GPRA performance goal in FY 2010 and FY 2011.
Lead Organization (FY 2012 and FY 2013) Office of Integrative Activities

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-2

Prepare and engage a diverse science, technology, engineering, and mathematics (STEM) workforce motivated to participate at the frontiers.

Strategic Target

NSF STEM workforce development programs, models, or strategies have rigorous evidence about the impact on diversity and innovation in the workforce.

Goal T-2.1 PRIORITY GOAL: Undergraduate Programs
Table 6
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Develop a diverse and highly qualified science and technology workforce.
Target Measure, Milestone, or Deliverable (FY 2012 and FY 2013) By September 30, 2013, 80 percent of institutions funded through NSF undergraduate programs document the extent of use of proven instructional practices.
Explanation (FY 2012 and FY 2013) NSF’s Strategic Plan emphasizes the pressing need for science and engineering workforce development and emphasizes diversity and innovation as aspects of high quality preparation and engagement. The current priority goal addresses NSF’s long-term core commitment to using undergraduate education to engage and prepare a diverse and highly qualified science and engineering workforce. Research shows that evidence-based instructional practices lead to improved student learning, and thus are a useful metric for assessing impact on a well-prepared workforce. Therefore, one way that NSF can advance its efforts to invest in the preparation of a strong science and engineering workforce is by encouraging and facilitating the use of empirically-based instructional practices in undergraduate science, technology, engineering, and mathematics (STEM) education. To do this first means establishing a baseline about the use of such practices. Implementation of this goal will include the design, piloting, and testing of a suite of methods to gather information about the nature of undergraduate STEM instructional practice in institutions. These methods will enable academic institutions to benchmark their instructional practices in STEM fields. By September 30, 2013, 80 percent of the institutions funded through NSF undergraduate programs will be expected to document the extent of their use of evidence-based instructional practices.
Potential Methods and Processes (FY 2012 and FY 2013) Action Plan will be posted on performance.gov in Summer 2012.
Trend information (FY 2012 and FY 2013) NA. This Priority Goal is a newly developed activity for FY 2012-FY 2013.
Lead Organization (FY 2012 and FY 2013) Directorate for Education and Human Resources

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-2

Prepare and engage a diverse science, technology, engineering, and mathematics (STEM) workforce motivated to participate at the frontiers.

Strategic Target

NSF STEM workforce development programs, models, or strategies have rigorous evidence about the impact on diversity and innovation in the workforce.

Goal T-2.2 Career-Life Balance
Table 7
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Promote Career-Life Balance policies and practices that support more fully utilizing the talents of individuals in all sectors of the American population – principally women, underrepresented minorities and persons with disabilities.
Target Measure, Milestone, or Deliverable By September 30, 2012, establish the FY 2012 baseline for number and value of award support provided to CAREER awardees and postdoctoral fellows intended to fund research technicians. By September 30, 2013, • Establish the FY 2013 baseline for number and value of awards provided to ADVANCE institutions intended to fund dual career supports. • Increase the number and value of research technician award support provided to CAREER awardees and postdoctoral fellows by 10 percent over FY 2012.
Explanation (FY 2012 and FY 2013) Although women comprise a significant and growing fraction of the U.S. STEM talent pool, recent studies demonstrate the challenges that they face when attempting to balance the often extreme demands of career and life without adequate institutional support. Utilizing women’s talent and potential in STEM fields is critical to the nation’s future success in science and technology and to economic prosperity. To address this challenge, NSF’s Career-Life Balance (CLB) Initiative, a set of forward-looking policies and practices, will help to increase the placement, advancement, and retention of women in STEM disciplines, particularly women who are seeking tenure in academe. NSF aims to enhance existing – and implement new – gender-neutral, family-friendly policies, as it is important that our nation’s colleges and universities accommodate the needs of the largest-growing segment of our science and engineering workforce. The Foundation is pursuing an agency-level pathway approach across higher education and career levels (i.e., graduate students, postdoctoral students, and early-career scientists and engineers). CLB seeks new and innovative ways in which NSF can partner with U.S. universities, colleges, and research institutions to help attract, nurture, and retain a much greater fraction of women engineers and scientists in the nation’s STEM workforce. In FY 2012 and FY 2013, NSF will introduce CLB supports for technicians for awardees (CAREER and postdoctoral fellows) who need temporary help to continue research while facing the demands of child and/or elder dependent care. In FY 2013, the NSF will introduce a support through the ADVANCE Institutional Transformation award for hiring dual-career spouses. Achievement of the FY 2012 and FY 2013 targets will signify significant new CLB supports for postdoctoral students and early-career scientists and engineers.
Potential Methods and Processes (FY 2012 and FY 2013) The following methods will be used: • A Working Group with representation from across the agency to coordinate activities. • Communications strategy to raise awareness around the Foundation and in the S&E community about the CLB initiative (e.g. dedicated dynamic webpage, email announcements, town hall meetings, conferences, and activity fairs). • Training of NSF staff in opportunities and resources, and award management provided through CLB. • Establishment of unique accounting codes to facilitate financial reporting of CLB supports. • Individual program officers to work with grantees to identify opportunities to use the supports. • Development of an integrated documentation, assessment and evaluation approach to guide and determine the added value of the CLB initiative.
Trend information (FY 2012 and FY 2013) NA. The CLB Initiative is new in FY 2012. Such support provided in past years was not trackable.
Lead Organization (FY 2012 and FY 2013) Office of the Director

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-3

Keep the United States globally competitive at the frontiers of knowledge by increasing international partnerships and collaborations.

Strategic Target

NSF programs increasingly establish international partnerships that advance the frontiers of knowledge.

Goal T-3.1 International Implications
Table 8
Fiscal Years 2012 and 2013
Statement Increase the proportion of new NSF solicitations, announcements, and Dear Colleague Letters that have international implications.
Target Measure, Milestone, or Deliverable FY 2012: Increase the proportion of new NSF solicitations, announcements, and Dear Colleague Letters that have international implications by 10 percent over FY 2011. FY 2013: Increase the proportion by 10 percent over FY 2012.
Explanation As science and engineering expertise and infrastructure advance across the globe, it is expected that the United States will increasingly benefit from international collaborations and a globally engaged workforce leading to transformational science and engineering breakthroughs. To this end, NSF promotes cooperation among scientists and engineers from all nations and encourages funding of international collaborative activities through all of our programs. By supporting institutions that collaborate on research, education, and related activities with international colleagues, U.S. scientists and engineers gain access to unique facilities and research sites and to partnerships with the global research community. In these ways, they are able to augment what might otherwise be purely domestic activities and resources in their field and have an opportunity to better understand the increasingly global character of science and engineering. In NSF’s internal document clearance process, program officers have the opportunity to indicate whether a given solicitation, announcement, or Dear Colleague Letter has international implications. For example, study of earthquakes may require international travel or collaboration, so a solicitation on the topic may have implications for international activity. OISE will work with NSF directorates and offices to identify opportunities to expand international engagements among NSF activities and incorporate them into these proposal-generating documents.
Potential Methods and Processes The number of new NSF solicitations, announcements, and Dear Colleague Letters that have international implications will be compared to the total number of such documents issued by NSF in each year.
Trend information This was a new goal in FY 2011 under NSF’s new strategic plan framework. OISE conducted a count of these materials in FY 2011 and established a baseline of 23 solicitations, announcements, and Dear Colleague Letters. FY 2011 baseline: 20 percent (23 of 116 solicitations, announcements, and Dear Colleague Letters issued in FY 2011). FY 2012 target: 22 percent of solicitations, announcements, and Dear Colleague Letters issued in FY 2012. FY 2013 target: 24 percent of solicitations, announcements, and Dear Colleague Letters issued in FY 2013. (A short worked calculation of these proportions appears after this table.)
Lead Organization Office of International Science and Engineering
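The proportion tracked for this goal is a simple ratio: documents with international implications divided by all proposal-generating documents issued in a year, with each year's target set roughly 10 percent above the prior year's level in relative terms. The following minimal Python sketch works through the published FY 2011 figures (23 of 116); it is only an illustration of the arithmetic, not an NSF tool, and the 1.10 multiplier is an assumption inferred from the stated targets.

```python
# Minimal sketch: proportion of proposal-generating documents with
# international implications, and a 10 percent relative-increase target.
def proportion(international_docs: int, total_docs: int) -> float:
    """Share of solicitations, announcements, and DCLs with international implications."""
    return international_docs / total_docs

fy2011_baseline = proportion(23, 116)       # ~0.198, reported as 20 percent
fy2012_target = fy2011_baseline * 1.10      # ~0.218, consistent with the 22 percent target
fy2013_target = fy2012_target * 1.10        # ~0.240, consistent with the 24 percent target

for label, value in [("FY 2011 baseline", fy2011_baseline),
                     ("FY 2012 target", fy2012_target),
                     ("FY 2013 target", fy2013_target)]:
    print(f"{label}: {value:.1%}")
```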

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-4

Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and to enable transformation at the frontiers.

Strategic Target

NSF prioritizes and manages facility investments throughout their life-cycle in a transparent and effective way.

Goal T-4.1 Construction Project Monitoring
Table 9
Fiscal Years 2012 and 2013
Statement For all MREFC facilities under construction, keep negative cost and schedule variance at or below 10 percent.
Target Measure, Milestone, or Deliverable 100 percent of construction projects that are over 10 percent complete.
Explanation The Major Research Equipment and Facilities Construction (MREFC) account supports the acquisition, construction, and commissioning of major research facilities and equipment that provide unique capabilities at the frontiers of science and engineering. This goal provides a monitoring component for NSF’s “no cost overrun” policy, discussed on page 2 of the MREFC chapter. The 10 percent target is consistent with OMB and Congressional guidelines for large projects.
Potential Methods and Processes Performance of construction projects funded by the MREFC account is monitored using the Earned Value Management (EVM) system. EVM is an integrated management control system for assessing, understanding, and quantifying what a contractor or field activity is achieving with program dollars. Monitoring cost and schedule is a standard measure of performance for construction projects. Projects that are under ten percent complete are not considered eligible for this goal because EVM data is less meaningful statistically in the very early stages of a project. Early in a project, the actual costs of the work, and the total values of the work scheduled and performed, are small compared to the total project cost and schedule. Consequently, their ratios - the reported cost and schedule variances - can change by large amounts even though the real values of their differences are small. (An illustrative variance calculation appears after Figure 1 below.)
Trend information Monitoring construction projects using the EVM method is an ongoing practice at NSF. Construction Project Monitoring performance trends for FY 2006-FY 2011 are shown in Figure 1 below.
Lead Organization Large Facilities Office, Office of Budget, Finance, and Award Management
Figure 1. Construction Project Monitoring performance trends, FY 2006-FY 2011
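To illustrate the monitoring rule described in this table, the sketch below computes standard EVM cost and schedule variance percentages (cost variance relative to earned value, schedule variance relative to planned value) and checks them against the 10 percent threshold, excluding projects under 10 percent complete. The data class, field names, and sample figures are hypothetical; this is not NSF's EVM system, only a minimal sketch of why early-stage variance ratios are unstable.

```python
from dataclasses import dataclass

@dataclass
class ProjectEVM:
    name: str
    bac: float  # Budget at Completion (dollars in millions)
    pv: float   # Planned Value of work scheduled to date
    ev: float   # Earned Value of work performed to date
    ac: float   # Actual Cost of work performed to date

def variances(p: ProjectEVM) -> tuple[float, float, float]:
    """Return (percent complete, cost variance %, schedule variance %)."""
    pct_complete = p.ev / p.bac
    cv_pct = (p.ev - p.ac) / p.ev   # negative means over cost
    sv_pct = (p.ev - p.pv) / p.pv   # negative means behind schedule
    return pct_complete, cv_pct, sv_pct

def within_goal(p: ProjectEVM, threshold: float = -0.10) -> bool:
    """True if negative cost and schedule variance stay at or above -10 percent.
    Projects under 10 percent complete are excluded from the goal."""
    pct_complete, cv_pct, sv_pct = variances(p)
    if pct_complete < 0.10:
        return True  # not eligible; early-stage ratios are statistically unstable
    return cv_pct >= threshold and sv_pct >= threshold

# Hypothetical examples: the same $2M overrun is a large ratio early on
# but a small one once substantial work has been earned.
early = ProjectEVM("early-stage facility", bac=500.0, pv=12.0, ev=10.0, ac=12.0)
late = ProjectEVM("mid-stage facility", bac=500.0, pv=260.0, ev=250.0, ac=252.0)
for p in (early, late):
    print(p.name, variances(p), "meets goal:", within_goal(p))
```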

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-4

Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and to enable transformation at the frontiers.

Strategic Target

Ensure data generated by NSF’s major multi-user facilities are widely accessible to the research community.

Goal T-4.2 PRIORITY GOAL: Access to Digital Products of NSF-Funded Research
Table 10
Fiscal Years 2012 and 2013
Statement Increase opportunities for research and education through public access to high-value digital products of NSF-funded research.
Target Measure, Milestone, or Deliverable By September 30, 2013, NSF will have established policies for public access to high-value data and software in at least two data-intensive scientific domains.
Explanation Digital data are increasingly one of the primary products of scientific research. As advanced by the National Science Board, open data sharing is closely linked to public access to scholarly publications resulting from federally-funded unclassified research, and they should be considered in concert. The digital data underlying figures and the key findings in the literature should be accessible and linked to one another so that scientists can verify and reproduce major findings in the literature and repurpose the data to enable new discoveries. Simultaneously, access to digital products of research enhances openness and transparency in the scientific enterprise and enables new types of multi-disciplinary research and education. Therefore, it is increasingly important for NSF to facilitate and encourage access to data and research results. The priority goal supports this vision of increasingly collaborative and multi-disciplinary science by assuring that knowledge and data can flow easily across traditional disciplinary boundaries. This goal is also linked to the National Science Foundation’s concept for “OneNSF”, which promotes collaboration in well-integrated and efficient ways across organizational and disciplinary boundaries. The effort is led by the Assistant Director of the Mathematical and Physical Sciences Directorate. All programmatic directorates and offices will provide appropriate program staff for working groups. NSF will help staff, and liaise with, related activities of the National Science Board. Personnel from the NSF Policy Office (Office of Budget, Finance, and Award Management, Division of Institutional and Award Support) and NSF Information Systems (Office of Information and Resource Management, Division of Information Systems) will be required for working groups and implementation. Other research agencies, such as the National Institutes of Health (NIH) and the Department of Energy (DOE), face similar challenges with regard to products of federally-funded scientific research. NSF staff communicate with representatives of other agencies to identify best practices for facilitating access to data. Prototyping and implementation will engage a variety of external stakeholders.
Potential Methods and Processes Action Plan will be posted on performance.gov in Summer 2012.
Trend Information NA. This Priority Goal is a newly developed activity for FY 2012-FY 2013. The National Science Foundation (NSF) has aligned its Priority Goal on Access to Digital Products with the open data policy process being led by OSTP. NSF participates in OSTP's Open Data Policy initiative by serving on the National Science and Technology Council (NSTC)'s Interagency Working Group on Digital Data (IWGDD). The IWGDD is tasked with identifying the specific objectives and public interests that need to be addressed by any policies in this area. The working group includes representatives from the Department of Energy, the National Institutes of Health, and other science funding agencies as well as NSF. The group is assessing the varying missions, types of data, standards, and dissemination models associated with the range of Federal science agencies and scientific disciplines, and will help OSTP address other public access requirements in the COMPETES Act. The two NSF representatives on the IWGDD also serve on the NSF-wide group on data and access, and facilitate regular communications among these two groups, NSF leadership, and the National Science Board (particularly the Task Force on Data Policies) in order to ensure that the Foundation's activities to facilitate increased access to digital products of federally funded research are aligned with those led by OSTP. NSF’s activities for the Priority Goal will take into account the processes and short-term and intermediate-term outcomes of the IWGDD.
Lead Organization Directorate for Mathematical and Physical Sciences

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-1

Make investments that lead to results and resources that are useful to society.

Strategic Target

NSF investments underpin long-term solutions to societal challenges such as economic development, climate change, energy, and cyber-security.

Goal I-1.1 PRIORITY GOAL: Innovation Corps
Table 11
Fiscal Years 2012 and 2013
Statement Increase the number of entrepreneurs emerging from university laboratories.
Target Measure, Milestone, or Deliverable By September 30, 2013, 80 percent of teams participating in the Innovation Corps program will have tested the commercial viability of their product or service.
Explanation Through the Innovation Corps (I-Corps) program, NSF seeks to accelerate the development of new technologies, products and processes that arise from fundamental research. The goals of I-Corps are to spur translation of fundamental research, to encourage collaboration between academia and industry, and to train students to understand innovation and entrepreneurship. With I-Corps, NSF supports NSF-funded researchers whose efforts will be augmented - in the form of mentoring and funding - to accelerate the translation of knowledge derived from fundamental research into emerging products and services that can attract subsequent third party funding. NSF investments will strategically strengthen the innovation ecosystem (http://www.nsf.gov/eng/iip/innovation.pdf) by addressing the challenges inherent in the early stages of the innovation process. Implementation of this goal will require the development of mechanisms and partnerships to support I-Corps teams. The teams will consist of three people and will include an NSF Principal Investigator (PI), an Entrepreneurial Lead (typically a graduate student) and a mentor from the private sector. Teams will propose a specific service or product concept for six months of I-Corps-funded development and testing. The concepts will have arisen from basic research led by the PI and funded by NSF within the previous five years. The selected teams will proceed through an intensive prescribed curriculum designed to develop their entrepreneurial skills and to ensure that the critical assumptions underpinning their perceived opportunities are tested. At the end of the six months, teams will decide whether or not to proceed with development of their concept.
Potential Methods and Processes Action Plan will be posted on performance.gov in Summer 2012.
Trend information NA. The Innovation Corps program began in the fourth quarter of FY 2011.
Lead Organizations Office of Integrative Activities and Directorate for Engineering

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-1

Make investments that lead to results and resources that are useful to society.

Strategic Target

NSF investments underpin long-term solutions to societal challenges such as economic development, climate change, energy, and cyber-security.

Goal I-1.2 Industrial and Innovation Partnerships
Table 12
Fiscal Years 2012 and 2013
Statement Identify the number and types of partnerships entered into by Industrial & Innovation Partnerships (IIP) Division grantees.
Target Measure, Milestone, or Deliverable FY 2012: • Count the number of financial partnerships in FY 2010 and FY 2011 made by IIP program grantees. • Evaluate the potential to collect other types of partnership data in the future (e.g., strategic partnerships, people partnerships, in-kind partnerships, lab sharing, acquisitions, etc.). FY 2013: • Count the number of financial partnerships in FY 2012 made by IIP program grantees.
Explanation The IIP programs are: • Small Business Innovation Research (SBIR) • Small Business Technology Transfer (STTR) • Industry/University Cooperative Research Centers (I/UCRC) • Partnerships for Innovation (PFI) • Grant Opportunities for Academic Liaison with Industry (GOALI) “Partnership” here includes only “financial investments” for the purpose of baselining all IIP programs. Examples of a financial investment would include: • Subcontractor in SBIR Award • Executed third party investment package in SBIR supplement (required for award) • Partnership condition in award (e.g., GOALI, PFI, STTR, SBIR: Phase IICC, Phase IIA, TECP) • I/UCRC Industrial Advisory Board Member • I/UCRC Interagency Agreement and Military Interdepartmental Purchase Requests (MIPRs)
Potential Methods and Processes Development of a sound methodology and collection mechanism that, if approved (e.g., a final report template), would be used starting with FY 2013 awards. (A minimal tallying sketch follows this table.)
Trend information This was a new goal in FY 2011 under NSF’s new strategic plan framework. A baseline of FY 2010 partnerships (1,567) was determined in FY 2011 in three of the IIP division’s five programs: SBIR/STTR, PFI, and I/UCRC. However, due to changes in data collection systems and clarification of the “partnership” definition, FY 2010 will be re-baselined in FY 2012.
Lead Organization Industrial & Innovation Partnerships Division, Directorate for Engineering
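Baselining here amounts to tallying partnership records by program while restricting the count to financial investments as defined above. The sketch below is a hypothetical illustration of such a tally; the record format and sample entries are invented and are not IIP's actual collection mechanism or final report template.

```python
from collections import Counter

# Hypothetical partnership records extracted from award reports:
# (program, partnership_type, is_financial)
records = [
    ("SBIR", "subcontractor", True),
    ("SBIR", "third-party investment supplement", True),
    ("GOALI", "partnership condition in award", True),
    ("I/UCRC", "Industrial Advisory Board member", True),
    ("PFI", "in-kind partnership", False),  # excluded from the financial baseline
]

financial = [(prog, kind) for prog, kind, is_fin in records if is_fin]
by_program = Counter(prog for prog, _ in financial)

print(f"Financial partnerships counted: {len(financial)}")
for program, count in sorted(by_program.items()):
    print(f"  {program}: {count}")
```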

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-2

Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.

Strategic Target

NSF’s scientific literacy and public engagement programs are supported by rigorous evidence about learning outcomes

Goal I-2.1 Public Understanding and Communication of Science and Engineering
Table 13
Fiscal Years 2012 and 2013
Statement Establish a common set of evidentiary standards for programs and activities across the agency that fund public understanding and communication of science and engineering activities.
Target Measure, Milestone, or Deliverable FY 2012: By September 30, 2012, • Deliver an internal report defining standards of evidence for the models used by the 16 programs identified in FY 2011 that fund public understanding and communication of science and engineering. • Identify all programs across the agency that employ the models and strategies. FY 2013: By September 30, 2013, utilize the report to inform the revision of solicitation language in one-half of the programs identified in FY 2012 to reflect evidence standards.
Explanation Certain programs in EHR’s Division of Research on Learning in Formal and Informal Settings (DRL) aim to address public understanding and communication of science and engineering, but other NSF activities also work towards this aim. This goal’s intent is to identify all such activities across the Foundation and provide them with evidence-based criteria for evaluation of such projects. This can lead to more consistent expectations across NSF for the use and production of evidence. Three-year trajectory: search NSF core programs to identify programs that explicitly address Public Understanding and Communication; extract models and evidence strategies used by those programs and reassess NSF’s investments to identify those that implicitly address Public Understanding and Communication; use this list of programs to establish common criteria for Public Understanding and Communication activities in general across programs.
Potential Methods and Processes EHR staff will collect, search, and analyze NSF data and information; an NSF-wide group of program staff will be convened and solicitations revised. This effort will also include collaboration with the Office of Legislative and Public Affairs (OLPA).
Trend information This was a new goal in FY 2011 under NSF’s new strategic plan framework. In FY 2011, a keyword search of NSF program solicitations using the term “public understanding” was used to generate a baseline. Sixteen programs had this phrase in their solicitations. Searches performed using awarded project descriptions and abstracts identified a number of projects that include the focus in ways other than specifically mentioned as part of a formal program solicitation. Consequently, a baseline based on those numbers was not used because the context and definition of the phrase “public understanding” in funded project descriptions varied. (A minimal sketch of this kind of keyword search appears after this table.)
Lead Organization Division of Research on Learning in Formal and Informal Settings (DRL), EHR
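The FY 2011 baseline came from a keyword search of program solicitations for the phrase "public understanding." The sketch below shows what such a search can look like, assuming solicitation texts are available as plain-text files in a local folder; the folder path and file layout are placeholders, and this is not the tool NSF used.

```python
import pathlib

PHRASE = "public understanding"

def solicitations_mentioning(folder: str, phrase: str = PHRASE) -> list[str]:
    """Return the names of solicitation text files containing the phrase."""
    hits = []
    for path in pathlib.Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        if phrase in text:
            hits.append(path.stem)
    return hits

# Hypothetical usage: count programs whose solicitation mentions the phrase.
matches = solicitations_mentioning("solicitations/")
print(f"{len(matches)} solicitations mention '{PHRASE}'")
```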

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-2

Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.

Strategic Target

NSF’s K-12 STEM education investments are designed and tested for scale-up.

Goal I-2.2 K-12 Components
Table 14
Fiscal Years 2012 and 2013
Statement Establish a common set of evidentiary standards for programs across the agency that fund activities with K-12 components.
Target Measure, Milestone, or Deliverable FY 2012: By September 30, 2012, • Identify the number of programs that fund activities with K-12 components in FY 2012. • Develop common standards of evidence for inclusion in future solicitations of the identified programs. FY 2013: By September 30, 2013, • 100 percent of programs identified in FY 2012 (“the portfolio”) will include the common standards in their solicitations. • A baseline count will be taken of the projects in the portfolio that already meet these standards.
Explanation There is increasing interest across the federal government not just to count the number of programs addressing K-12 education, but to examine the potential of projects for “going to scale”: moving beyond the initial project site to be adapted and implemented successfully under more representative conditions and with appropriate population groups. There are multiple sets of standards for identifying a project’s readiness for scale-up. Sources for standards of evidence that will be examined in FY 2012 include those under development at the Department of Education and NSF, individual NSF programs (e.g. Arctic Sciences rigor standards; DRK-12 Horizon Research Standards), and National Academies of Science reports.
Potential Methods and Processes NSF expert staff will review standards of evidence in the literature, repeat and revise the FY 2011 analysis, update solicitations, and write reports. A baseline count of the number of projects that meet the standards will be conducted in FY 2013. It is estimated that approximately 15 percent of the projects in the portfolio will already meet the standards.
Trend information This was a new goal in FY 2011 under NSF’s new strategic plan framework. A baseline of NSF’s K-12 programs (FY 2011 baseline: 16) was established as the first step in a three-year trajectory to establish a set of standards in common across NSF to articulate a pathway toward readiness to scale up.
Lead Organization Directorate for Education and Human Resources

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-3

Support the development of innovative learning systems.

Strategic Target

NSF invests in innovative learning tools and structures that use emerging technologies and are tested for effectiveness and scalability.

Goal I-3.1 Innovative Learning Systems
Table 15
Fiscal Years 2012 and 2013
Statement Integrate common language about, or goals for, innovative learning research into the Cyberlearning, Data and Observation for STEM Education focus area of the Expeditions in Education (E2) investment, and into other programs across the agency that fund innovative learning tools, structures, and systems.
Target Measure, Milestone, or Deliverable FY 2012: By September 30, 2012, write a synthesis report on NSF support of Innovative Learning Systems supporting common language for solicitations. FY 2013: By September 30, 2013, • Programs with significant innovative learning system research will update their solicitations with the language developed in FY 2012 to include common language or goals about innovative learning systems. • At least 50 percent of new projects funded in the innovative learning systems portfolio will have in place research and evaluation mechanisms that will provide high-quality evidence about the nature of student learning.
Explanation Networked computing and communications technologies that support learning, teaching, and education are already opening up access for all learners, in all age groups, in all settings. Innovative learning systems can bring authentic scientific data immediately to learners, which enables learners to experience science through modeling, simulation, sensor networks, digital telescopes and remote instruments. Expeditions in Education (E2) is an NSF initiative to infuse cutting-edge science, engineering, and innovation into the preparation of a world-class scientific workforce for the twenty-first century, and to ensure that all of NSF’s education and workforce investments are drawing on the latest educational theory, research, and evidence. E2 activities will integrate, leverage, and expand STEM education research and development to improve learning in science and engineering (S&E) disciplines and capitalize on the scientific assets across NSF to bring engaging new science content, knowledge, and real-world applications to more learners. The “Cyberlearning, Data, and Observations for STEM Education” focus area of E2 aims to address the many questions associated with how STEM learning can be enhanced and how new content can be introduced using cyberlearning resources and tools.
Potential Methods and Processes NSF staff will review standards of evidence in the literature, confer with experts in other agencies and the field, repeat and revise the FY 2011 analysis, update solicitations, and write reports.
Trend information This was a new goal in FY 2011 under NSF’s new strategic plan framework. Its intent was to identify activities across the Foundation that contribute to development of innovative learning systems, which are not funded by any one program. In FY 2011, a latent semantic analysis tool was used to find awards made in FY 2011 that fit into the general category of Research-Based Innovative Learning Systems (ILS). 150 awards were identified, 95 percent of which were made by eight divisions within the EHR, CISE, and ENG directorates. The awards corresponded to 28 distinct programs. (A minimal sketch of this kind of latent semantic analysis appears after this table.)
Lead Organization Directorate for Education and Human Resources
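NSF's latent semantic analysis tool is not described in this plan, so the sketch below is only a generic illustration of the technique using scikit-learn: award abstracts and a seed description of innovative learning systems are projected into a reduced TF-IDF space and compared by cosine similarity. The seed text, sample abstracts, and similarity threshold are all hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical seed description of the category of interest.
seed = ("research-based innovative learning systems using cyberlearning, "
        "simulation, sensor networks, and remote instruments to support STEM learning")

# Hypothetical award abstracts; in practice these would come from the awards database.
abstracts = [
    "a cyberlearning platform that streams real sensor data into middle school classrooms",
    "synthesis of novel catalysts for hydrogen storage",
    "remote operation of digital telescopes for undergraduate astronomy courses",
]

docs = [seed] + abstracts
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
# Reduce to a small latent space (n_components must be less than the vocabulary size).
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Compare each abstract to the seed description in the latent space.
scores = cosine_similarity(lsa[:1], lsa[1:])[0]
for abstract, score in zip(abstracts, scores):
    flag = "ILS candidate" if score > 0.5 else "other"
    print(f"{score:5.2f}  {flag}  {abstract[:60]}")
```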

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Goal M-1.1 Model EEO Agency
Table 16
Fiscal Years 2012 and 2013
Statement Perform activities necessary to attain essential elements of a model EEO agency, as defined by the Equal Employment Opportunity Commission (EEOC). Collaborate with the Chief Human Capital Officer (CHCO) in drafting the Office of Diversity and Inclusion’s responsibilities within NSF’s first Diversity and Inclusion (D&I) Strategic Plan for submission to the Office of Personnel Management (OPM).
Target Measure, Milestone, or Deliverable FY 2012: • Attain four of six essential elements. • Submit D&I Strategic Plan to OPM by March 30, 2012. FY 2013: • Attain five of six essential elements.
Explanation For NSF to achieve model EEO agency status, it must meet and maintain each of the six criteria established by the EEOC. The EEOC refers to these criteria as the “Essential Elements” of a Model Agency, which are: A. Demonstrated commitment from agency leadership; B. Integration of EEO into the agency's strategic mission; C. Management and program accountability; D. Proactive prevention of unlawful discrimination; E. Efficiency; and F. Responsiveness and legal compliance. Per Executive Order 13583, which establishes a coordinated government-wide initiative to promote diversity and inclusion in the federal workforce, NSF will submit a D&I Strategic Plan to OPM in FY 2012. Specifically, the Office of Diversity and Inclusion’s (ODI’s) focus in the D&I plan will include, after review of the required barrier analysis, action plans to eliminate any identified barriers and implementation progress for at least two NSF directorates.
Potential Methods and Processes To evaluate NSF’s progress towards meeting measures in the essential elements of a model EEO agency, ODI will conduct an annual self-assessment, as required by the EEOC. The assessment is certified for accuracy by both ODI and the NSF Director. At the end of each fiscal year, ODI will provide to the verifiers a narrative illustrating NSF’s progress, based on the certified self-assessment, as well as data, such as training, and any supporting documentation that is not protected under privacy laws. In evaluating targets regarding NSF’s D&I Strategic Plan, the following will be measured: 1. Whether the D&I Plan was submitted to OPM on time. 2. Whether ODI, upon review of its annual barrier analysis, identified any potential barriers to EEO, worked with applicable senior leaders in at least two directorates in devising plans to eliminate any barriers, and provided an assessment of the plan for effectiveness.
Trend information In FY 2011, the first year of this performance goal, model EEO agency criteria A (Demonstrated commitment from agency leadership), B (Integration of EEO into the agency's strategic mission), and E (Efficiency) were achieved.
Lead Organization Office of Diversity and Inclusion, Office of the Director

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Goal M-1.2 IPA Performance Plans
Table 17
Fiscal Years 2012 and 2013
Statement Include assignees on temporary appointment to NSF under the Intergovernmental Personnel Act (IPAs) in an NSF performance management system.
Target Measure, Milestone, or Deliverable FY 2012: By March 31, 2012, 95 percent of executive-level IPAs whose assignments have at least 90 days remaining will have performance plans in place. By September 30, 2012, 90 percent of non-executive IPAs whose assignments have at least 90 days remaining will have performance plans in place. FY 2013: By March 31, 2013, 100 percent of executive IPAs with appointments exceeding 90 days will have performance plans in place. By September 30, 2013, 95 percent of all non-executive IPAs whose assignments have at least 90 days remaining will have performance plans in place. By October 1, 2013, an evaluation of the effectiveness of executive and non-executive IPA performance plans in setting and communicating expectations will be completed. By October 31, 2013, best practices for managing executive and non-executive IPA performance will be identified and shared.
Explanation This goal addresses human resource management challenges specific to NSF that were identified by Congress, the Office of Personnel Management, and NSF’s Office of the Inspector General. The Intergovernmental Personnel Act (IPA) mobility program (5 CFR part 334) provides the authority for NSF to bring in scientific staff for limited periods of time. IPA assignees are on detail to NSF and remain on the payroll of their home institution. Using the IPA authority to recruit active researchers infuses new talent and expertise into NSF and provides scientists and engineers with valuable information and knowledge to bring back to their home institutions. NSF’s use of the IPA helps to maintain the Foundation’s close association with the nation’s colleges and universities, and the contributions made by NSF’s IPA scientists further the agency’s mission of supporting the entire spectrum of science and engineering research and education.
Potential Methods and Processes In order to facilitate tracking and documentation, HRM will seek to develop an electronic process for submitting and tracking IPA performance plans and appraisals. HRM will conduct interviews, focus groups, and/or surveys with IPAs and their supervisors to determine the impact of these performance plans on IPAs’ understanding of what is expected of them and their ability to support NSF’s mission.
Trend information Before FY 2011, IPAs were not required to submit performance plans. In FY 2011, a performance goal to expand the coverage of NSF’s performance management framework to include IPAs was set. In the first year, 92 percent of all non-executive IPAs (target: 80 percent) and 90 percent of executive-level IPAs (target: 90 percent) had performance plans on file.
Lead Organization Division of Human Resources Management (HRM), Office of Information and Resource Management (OIRM)

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Goal M-1.3 Performance Management System
Table 18
Fiscal Year 2012 2013
Statement Use findings from assessments to guide improvement of NSF’s employee performance management systems. Acronyms: • CHCO: Chief Human Capital Officer • EVS: Employee View Point Survey • GWF: General Workforce • PAAT: Performance Appraisal Assessment Tool • SES: Senior Executive Service Use findings from assessments to guide improvement of NSF’s employee performance management systems. Acronyms: • CHCO: Chief Human Capital Officer • EVS: Employee View Point Survey • GWF: General Workforce • PAAT: Performance Appraisal Assessment Tool • SES: Senior Executive Service
Target Measure, Milestone, or Deliverable By September 30, 2012, deliver an action strategy for improvement of one to three areas noted in NSF’s SES or GWF PAAT or identified in NSF’s EVS results to the NSF CHCO. By July 31, 2013, • Submit 2013 NSF SES PAAT to OPM. • Put in place the needed supporting materials for full implementation of the government-wide SES Performance Plan and Appraisal Process. By September 30, 2013, achieve a 65 percent positive response rate on the 2012 EVS to the question: “In my most recent performance appraisal, I understood what I had to do to be rated at different performance levels (for example, Fully Successful, Outstanding).”
Explanation (FY 2012 and FY 2013) NSF has two primary performance management systems for NSF employees, one that covers members of the Senior Executive Service (SES) and one that covers the General Workforce (GWF). In 2011 NSF added a third performance system to cover staff on assignment to NSF through the Intergovernmental Personnel Act (IPA) mobility program. Staff under this third system are covered by goal M-1.2. In 2011, NSF administered OPM’s Performance Appraisal Assessment Tool (PAAT) for both the SES and GWF performance management systems. The SES PAAT was submitted to OPM in September 2011, and NSF’s SES performance management system was certified in January 2012. NSF can use the OPM review materials and internal review to identify potential areas of weakness and to develop a strategy for improving the SES performance management system in conjunction with a new government-wide approach to SES performance management. The GWF PAAT was submitted to OPM in December 2011. It is still under review at OPM. Feedback from OPM will be incorporated with related internal review processes to develop a strategy for improving the GWF performance management system. The Federal EVS is a tool that measures employees' perceptions of whether, and to what extent, the conditions that characterize successful organizations are present in their agencies. The EVS includes questions related to performance appraisal. This goal addresses human resource management challenges specific to NSF that were identified by Congress, the Office of Personnel Management, and NSF’s Office of the Inspector General.
Potential Methods and Processes (FY 2012 and FY 2013) HRM will: • Analyze the findings of the SES PAAT, the GWF PAAT, and the EVS. • Partner with others to benchmark against and identify promising practices in other organizations to assist NSF in addressing priority areas. • Develop action strategy for implementation in FY 2013.
Trend information (FY 2012 and FY 2013) The NSF’s most recent SES-PAAT Assessment Report (released in December of 2011) identified a need for the NSF to establish guidelines about how organizational performance should be considered when deciding ratings and awards and to develop a plan for setting and adjusting SES rate of basic pay. • On the question related to organizational assessment guidelines, the NSF scored 4 out of 6 because the NSF did not provide written guidelines about how organizational performance should be considered when deciding ratings and awards. • On the question related to pay policy, the NSF scored 3 out of 5 because some criteria outlined in 5 CFR 534.404(g) were missing. NSF is still awaiting OPM’s review of the GWF PAAT. The 2011 EVS found that the percentage of NSF employees who understood what they had to do to be rated at different performance levels was lower than in previous years. For the EVS question “In my most recent performance appraisal, I understood what I had to do to be rated at different performance levels (for example, Fully Successful, Outstanding)”: 2010 EVS positive response rate: 68 percent. 2011 EVS positive response rate: 63 percent. 2012 EVS target: 65 percent. 2013 EVS target: 68 percent. A given year’s EVS results are made available approximately six months following the survey. Thus, the 2012 EVS results will be available in late FY 2012 or early FY 2013, and the 2013 results will be available in late FY 2013 or early FY 2014.
Lead Organization (FY 2012 and FY 2013) Division of Human Resources Management (HRM), Office of Information and Resource Management (OIRM)

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-2

Infuse learning as an essential element of the NSF culture with emphasis on professional development and personal growth.

Strategic Target

NSF emphasizes learning for personal and professional development for all staff.

Goal M-2.1 Assess Developmental Needs
Table 19
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Enhance NSF capabilities to provide training of staff for their current positions.
Target Measure, Milestone, or Deliverable
  FY 2012: By September 30, 2012, design a structured curriculum which meets assessed needs for at least two types of NSF staff roles (e.g. leaders, program officers, administrative professionals, technical professionals).
  FY 2013:
  • By September 30, 2013, identify gaps between desired curricula and current course offerings and recommend approaches to filling identified gaps.
  • Attain a 60 percent positive response rate on the 2013 Employee Viewpoint Survey (EVS) on the question “How satisfied are you with the training you receive for your present job?” (results available in FY 2014).
Explanation (FY 2012 and FY 2013) NSF core values and strategic goals place a high priority on learning and development for its staff. NSF stresses personal learning and development to enhance performance, further our knowledge base on all aspects of NSF activity, and continue to build for the future. This directly reflects the specific action identified in the Strategic Plan: “review current NSF learning opportunities and develop a plan for addressing gaps.”
Potential Methods and Processes (FY 2012 and FY 2013) HRM will continue to evaluate data from the 2011 and 2012 needs assessments to determine gaps between identified needs and current curricula and course offerings. Needs assessments will be designed to reflect the needs of key elements of the NSF workforce. They will use a combination of survey and focus group methods. Once identified, the gaps will be assigned a priority status and the Academy will recommend options for filling those gaps. Note that training around implementation of performance management systems will be important components of both this goal and the goals regarding improvement in performance management systems.
Trend information (FY 2012 and FY 2013) In FY 2011, HRM developed and launched targeted needs analysis questionnaires designed to generate new learning needs data. Questionnaires asked NSF administrative professional staff to rate the performance of specific skills necessary to complete their work, and rated the importance of creating additional skill-based learning and development opportunities to help them successfully complete their work. Approximately 38% of administrative support staff participated in the survey. In September 2011, contract support for assessment of the broader spectrum of NSF staff was obtained. The 2010 and 2011 Employee Viewpoint Surveys found that the number of employees satisfied with the training they received had decreased. For the EVS question “How satisfied are you with the training you receive for your present job?”: 2008 EVS positive response rate: 66 percent. 2010 EVS positive response rate: 61 percent. 2011 EVS positive response rate: 57 percent. 2012 EVS target: 60 percent. 2013 EVS target: 63 percent. A given year’s EVS results are made available approximately six months following the survey. Thus, the 2012 EVS results will be available in late FY 2012 or early FY 2013, and the 2013 results will be available in late FY 2013 or early FY 2014.
Lead Organization (FY 2012 and FY 2013) Division of Human Resources Management (HRM), Office of Information and Resource Management (OIRM)

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-3

Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service.

Strategic Target

NSF uses the innovation and creativity of our staff to improve agency processes and systems on a continuing basis.

Goal M-3.1 Financial System Modernization
Table 20
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Upgrade NSF’s financial system.
Target Measure, Milestone, or Deliverable
  FY 2012: By September 30, 2012, to support the iTRAK initiative, the Division of Financial Management (DFM) and the Division of Acquisition and Cooperative Agreements (DACS) will award a contract for the iTRAK financial system implementation and integration services.
  FY 2013: By September 30, 2013, to support the transition to the grant-by-grant payment process known as the Award Cash Management $ervice (ACM$), DFM will reconcile 100 percent of grantees’ reported cash on hand balances as of December 31, 2012 with NSF’s general ledger (see the illustrative reconciliation sketch following this table).
Explanation “iTRAK” is the Foundation-wide effort to transition NSF from its legacy financial support systems to a fully integrated, commercial-off-the-shelf (COTS) financial management shared services solution. In FY 2012, NSF will select a system integration contractor to implement the COTS solution. NSF is transitioning its financial processing of grants from a pooled system (quarterly reporting of expenditures by institution) to a grant-by-grant payment process where grant funds are requested and reported on an individual grant level.
Potential Methods and Processes The iTRAK Technical Evaluation Panel (TEP) reviews and rates the proposals submitted by vendors. Upon award, a memo from the Contracting Officer Technical Representative (COTR) will be given to the CFO stating that this goal has been met. DFM will reconcile each grantee’s award balance with NSF’s Financial Accounting System (FAS). Upon completion of the reconciliation process, a memo from the Cash Management Branch Chief will be given to the DCFO stating that this goal has been met.
Trend information (FY 2012 and FY 2013) Financial system modernization efforts have been underway at NSF for several years. The iTRAK effort, a Foundation-wide effort to transition NSF from its legacy financial support systems to a fully integrated, commercial-off-the-shelf (COTS) financial management shared services solution, is central, but other modernization steps are required as prerequisites. GPRA performance goals related to the activities were first developed in FY 2011 to measure one of these prerequisites, when functional requirements were gathered for changes in current system processes that will accommodate the transition to a grant-by-grant payment method. This payment method is a prerequisite for the transition to a COTS financial management shared services solution. The FY 2011 target for this goal was the documentation of functional requirements to transition to a real-time payment method. Documentation detailing business rules and functional requirements was prepared by the Office of Budget, Finance, and Award Management in FY 2011. However, during cross-agency working group meetings, additional requirements were raised that needed to be addressed. Consequently, this target was not met in FY 2011. In fulfillment of the goal, the requirements were delivered in December 2011. Establishing the capability to monitor expenditures at the award level is an essential aspect of NSF’s financial system modernization. NSF is committed to transition its financial processing of grants from a pooled system (quarterly reporting of expenditures by institution) to grant-by-grant (near real-time reporting of expenditures by award) by FY 2013. This change will result in more timely financial data and stronger monitoring programs. This initiative is per the Administration and CFO directive to increase efficiency and transparency in the agency.
Lead Organizations (FY 2012 and FY 2013) Divisions of Acquisition and Cooperative Agreements (DACS) and Financial Management (DFM), Office of Budget, Finance, and Award Management (BFA).
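The FY 2013 milestone for Goal M-3.1 calls for reconciling every grantee’s reported cash on hand under ACM$ with NSF’s general ledger, award by award. The plan does not describe the mechanics of that reconciliation; the following is only a minimal sketch of an award-level comparison, in which the file names, column names, and penny tolerance are illustrative assumptions rather than NSF’s actual ACM$ or FAS interfaces.

  # Minimal award-level reconciliation sketch. Field names, file formats, and
  # the tolerance are assumptions for illustration, not NSF's actual systems.
  import csv
  from decimal import Decimal

  def load_balances(path, id_field, amount_field):
      """Read a CSV of per-award balances into a dict of award ID -> amount."""
      balances = {}
      with open(path, newline="") as f:
          for row in csv.DictReader(f):
              balances[row[id_field]] = Decimal(row[amount_field])
      return balances

  def reconcile(reported, ledger, tolerance=Decimal("0.01")):
      """Return awards whose reported cash on hand differs from the ledger."""
      exceptions = []
      for award_id in sorted(set(reported) | set(ledger)):
          rep = reported.get(award_id, Decimal("0"))
          led = ledger.get(award_id, Decimal("0"))
          if abs(rep - led) > tolerance:
              exceptions.append((award_id, rep, led))
      return exceptions

  if __name__ == "__main__":
      reported = load_balances("acm_reported.csv", "award_id", "cash_on_hand")
      ledger = load_balances("fas_ledger.csv", "award_id", "balance")
      for award_id, rep, led in reconcile(reported, ledger):
          print(f"{award_id}: reported {rep} vs. ledger {led}")

Because every award present in either file is checked, awards reported by grantees but absent from the ledger (or vice versa) surface as exceptions rather than being silently skipped.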

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-3

Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service.

Strategic Target

NSF organizations achieve high levels of customer satisfaction

Goal M-3.2 Time-to-Decision
Table 21
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Inform applicants whether their proposals have been declined or recommended for funding within six months of deadline, target date, or receipt date, whichever is later.
Target Measure, Milestone, or Deliverable (FY 2012 and FY 2013) 70 percent.
Explanation (FY 2012 and FY 2013) Time-to-decision or “dwell time” is the amount of time that passes between receipt of a proposal and notification to the principal investigator about the funding decision. One of the most significant issues raised in customer satisfaction surveys is the time it takes NSF to process proposals. Too long a time period inhibits the progress of research as it delays the funding process, but too short a time period may inhibit the merit review process. The six-month target seeks to strike a balance between the need of the investigator for timely action and the need of NSF for a credible and efficient merit review system.
Potential Methods and Processes (FY 2012 and FY 2013) NSF automates the collection of data and calculation of result for this goal, thereby ensuring the quality of the data. The NSF databases used (FastLane and eJacket) have internal controls that ensure data quality.
Trend information (FY 2012 and FY 2013) Monitoring the merit review process with the time-to-decision metric is an ongoing practice at NSF. The most relevant recent variations in performance took place in FY 2009 and FY 2010. In the second quarter of FY 2009, the American Recovery and Reinvestment Act (ARRA) was passed. This goal was suspended for the second, third, and fourth quarters of that year to allow for a greater number of proposals to be processed with additional funds from ARRA. The goal was reinstated in FY 2010, when NSF exceeded this goal despite a significant increase in workload. Overall, staffing levels increased by 5.5 percent between FY 2008 and FY 2011, while proposal pressure increased by 17.4 percent. Time to decision performance trends, FY 2006-FY 2011 (see Figure 2 below; an illustrative calculation of this metric follows the figure).
Lead Organization (FY 2012 and FY 2013) Office of the Director
Time to decision performance trends, FY 2006-FY 2011
Figure 2

Time to decision performance trends, FY 2006-FY 2011A.png
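Goal M-3.2 is reported as the share of proposals whose decision is communicated within six months of the deadline, target date, or receipt date, whichever is later. The actual figure is computed automatically from FastLane and eJacket; the sketch below only illustrates the arithmetic on a few invented records, and the month-counting and boundary conventions are assumptions.

  # Illustrative dwell-time calculation for Goal M-3.2 (invented records; the
  # real result is computed from FastLane and eJacket, not from this sketch).
  from datetime import date

  proposals = [
      {"deadline": date(2011, 1, 15), "receipt": date(2011, 1, 10), "decision": date(2011, 6, 20)},
      {"deadline": date(2011, 1, 15), "receipt": date(2011, 1, 20), "decision": date(2011, 9, 1)},
      {"receipt": date(2011, 3, 1), "decision": date(2011, 8, 15)},   # no deadline or target date
  ]

  def months_between(start, end):
      """Whole months elapsed from start to end (partial months rounded down)."""
      months = (end.year - start.year) * 12 + (end.month - start.month)
      if end.day < start.day:
          months -= 1
      return months

  def decided_within_six_months(proposal):
      # The clock starts at the later of the deadline/target date and the receipt date.
      start = max(d for d in (proposal.get("deadline"), proposal.get("target"), proposal.get("receipt")) if d)
      return months_between(start, proposal["decision"]) <= 6

  share = sum(decided_within_six_months(p) for p in proposals) / len(proposals)
  print(f"{share:.0%} of proposals decided within six months (target: 70 percent)")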

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-3

Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service.

Strategic Target

NSF organizations achieve high levels of customer satisfaction

Goal M-3.3 Virtual Merit Review Panels
Table 22
Fiscal Year 2012 2013
Statement (FY 2012 and FY 2013) Expand the use of virtual merit review panels.
Target Measure, Milestone, or Deliverable
  FY 2012: By September 30, 2012, develop guidelines and training modules for NSF staff on the use of virtual merit review panels.
  FY 2013: As a pilot activity, five percent of merit review panels will be virtual panels.
Explanation (FY 2012 and FY 2013) This goal will test ways to make NSF’s merit review process more innovative, sustainable and family-friendly to encourage greater participation and the best use of resources. NSF makes extensive use of panels of reviewers to evaluate proposals. The predominant practice is for the panelists to travel to a single location, usually NSF, and meet face-to-face for one to five days. In FY 2010, approximately 1,800 review panels were held. Of these, just over one quarter involved six or fewer panelists. Face-to-face panels impose a significant time burden on the reviewers, making some potential reviewers reluctant to participate. For example, panelists with young children may not be able to obtain two continuous days of childcare, or panelists in remote locations or foreign countries may find the amount of travel required prohibitive. It also causes NSF to incur significant travel costs.
Potential Methods and Processes (FY 2012 and FY 2013) As used here, the term “virtual panel” refers to a panel meeting in which the reviewers do not travel to a common location but instead participate via teleconference, videoconference or an online meeting technology. In FY 2012, administrative offices and program staff will collaborate to develop: • An internal web-site that provides guidance to NSF staff on when to choose a virtual panel and how best to implement such panels; and • Training opportunities for NSF staff and reviewers. During the FY 2013 pilot, several different technologies to support virtual panels will be made available to programs experimenting with virtual panels. After the pilot has been completed, an evaluation of the effectiveness and efficiency of different approaches will be made and used to inform development of future policies on the use of virtual panels.
Trend information (FY 2012 and FY 2013) NSF has experimented with virtual panels at a small scale for several years. In FY 2011, approximately one percent of panels were virtual panels.
Lead Organization (FY 2012 and FY 2013) Office of Integrative Activities

Other Goals, Priorities, and Activities

This section provides information required by the GPRA Modernization Act on discontinued goals, lower-priority programs, unnecessary reports to Congress, and management challenges.

Changes from FY 2011

Most FY 2011 goals continued into FY 2012 and FY 2013. Several have modified goal statements but are logical follow-ons to the former activities. A few were discontinued as GPRA goals, although activities toward those objectives continue. For more about these goals, see the FY 2011 Performance Report.

Table 23
Strategic Goal / FY 2011 Annual Performance Goal / Status in FY 2012-FY 2013 Performance Plan
Transform the Frontiers:
  • T-1.1 Potentially Transformative Research: Modified
  • T-2.1 STEM Workforce Priority Goal: Modified (new Priority Goal)
  • T-3.1 International Implications: Continuing
  • T-4.1 Construction Project Monitoring: Continuing
  • T-4.2 Data Management Practices at Large Facilities: Modified (new Priority Goal)
Innovate for Society:
  • I-1.1 IIP Grantees’ Partnerships: Continuing
  • I-2.1 Public Understanding and Communication: Continuing
  • I-2.2 K-12 Components: Continuing
  • I-3.1 Innovative Learning Systems: Continuing
  • I-3.2 Partnerships for Learning Technologies: Discontinued
Perform as a Model Organization:
  • M-1.1 Model EEO Agency: Continuing
  • M-1.2 IPA Performance Plans: Continuing
  • M-1.3 360 Degree Evaluation Instrument: Discontinued
  • M-2.1 Staff Developmental Needs: Continuing
  • M-3.1 Grant-By-Grant Payments: Continuing
  • M-3.2 Time to Decision: Continuing


Lower-Priority Programs

The 2013 Cuts, Consolidations, and Savings (CCS) Volume of the President’s Budget identifies the lower-priority program activities under the GPRA Modernization Act (31 U.S.C. 1115(b)(10)). The public can access the CCS volume at: http://www.whitehouse.gov/omb/budget.

Burden Reduction/Unnecessary Plans and Reports to Congress

The GPRA Modernization Act of 2010 requires that agencies identify which of the plans and reports they provide to Congress are outdated or duplicative of other required plans and reports. The complete list of reports that NSF suggested for consolidation or elimination can be found in the President’s Budget: http://www.whitehouse.gov/omb/budget.

Management Challenges

A discussion of agency management challenges can be found in the FY 2011 Agency Financial Report, http://www.nsf.gov/pubs/2012/nsf12001/.

FY 2011 Annual Performance Report

In FY 2011, NSF set 16 performance goals, which together cover all program activities within the agency. Thirteen were achieved in FY 2011; achievement of the remaining three was delayed, but all had been achieved by the time this report was published. Below is a tabular overview.

Table 24

Strategic Goal / Annual Goal / FY 2011 Result
Transform the Frontiers:
  • T-1.1 Potentially Transformative Research: Achieved
  • T-2.1 STEM Workforce Priority Goal: Achieved
  • T-3.1 International Implications: Achieved
  • T-4.1 Construction Project Monitoring: Achieved
  • T-4.2 Data Management Practices at Large Facilities: Achieved
Innovate for Society:
  • I-1.1 IIP Grantees’ Partnerships: Achieved
  • I-2.1 Public Understanding and Communication: Achieved
  • I-2.2 K-12 Components: Achieved
  • I-3.1 Innovative Learning Systems: Achieved
  • I-3.2 Partnerships for Learning Technologies: Achieved
Perform as a Model Organization:
  • M-1.1 Model EEO Agency: Achieved
  • M-1.2 IPA Performance Plans: Achieved
  • M-1.3 360 Degree Evaluation Instrument: Not met (achieved 10/2011)
  • M-2.1 Staff Developmental Needs: Achieved after deadline
  • M-3.1 Grant-By-Grant Payments: Not met (achieved 12/2011)
  • M-3.2 Time to Decision: Achieved


The following pages present the results for each goal individually. Goals are presented in their strategic context, with reference to strategic goals, objectives, and targets from NSF’s FY 2011-FY 2016 Strategic Plan (see the first section of this chapter). The majority of FY 2011 goals were new because NSF’s Strategic Plan introduced impact-oriented goals that could not be measured with existing measures or techniques. Therefore, multiple years of trend data are available only for NSF’s long-standing quantitative performance measures, time to decision (M-3.2) and construction cost and schedule variance (T-4.1). A few goals monitor activities begun in recent years and therefore have limited historical or trend data (e.g. T-1.1, T-2.1, M-1.2).

A statement by the NSF Director verifying the reliability and completeness of the performance data in this report can be found in the FY 2011 Performance and Financial Highlights report at http://www.nsf.gov/about/history/annual-reports.jsp.

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-1

Make investments that lead to emerging new fields of science and engineering and shifts in existing fields.

Strategic Target

The NSF portfolio fully incorporates emerging areas with transformative potential, including those forming at disciplinary boundaries.

Lead Organization

Office of the Director.

Goal T-1.1 Potentially Transformative Research (PTR)
Table 25
Fiscal Year Goal Statement and Target Target Measure, Milestone, or Deliverable Result
2010 (new goal) Each directorate in the Research and Related Activities account will invest a minimum of $2.0 million per research division to leverage and facilitate activities that foster PTR. $94.0 million Achieved: $138.44 million
2011 Produce an analysis of NSF’s FY 2010 investments in activities undertaken to foster potentially transformative research. Deliverable: One analysis. Achieved: Report delivered in fourth quarter.
Discussion

NSF identifies PTR as work that may lead to:

  • Dramatically new ways of conceptualizing or addressing major scientific and technological challenges, or
  • New methods or analytical techniques that could put a discipline on a new scientific pathway, provide tools that allow unprecedented insights, or radically increase the rate of data collection.

In FY 2010, each Research and Related Activities (R&RA) directorate allocated a minimum of $2.0 million per research division ($94.0 million Foundation-wide) to explore methodologies that help support PTR. Each directorate devised its own methods to distribute the funds.

In FY 2011, using information collected from NSF directorates and offices, an analysis of the methods used to identify and/or facilitate potentially transformative research was performed. Challenges to these processes were also assessed. This analysis was drafted into an internal report including recommendations on how NSF can continue to promote PTR in the future.

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-2

Prepare and engage a diverse science, technology, engineering, and mathematics (STEM) workforce motivated to participate at the frontiers.

Strategic Target

NSF STEM workforce development programs, models, or strategies have rigorous evidence about the impact on diversity and innovation in the workforce.

Lead Organization

Directorate for Education and Human Resources.

Goal T-2.1 STEM Workforce Priority Goal
Table 26
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2010 Develop goals and metrics for NSF’s programmatic investments in its FY 2010 Learning portfolio. 100 percent of programs (baseline: 80 percent) Achieved: 100 percent of programs that received funding in FY 2010.
2011 NSF science, technology, engineering, and mathematics (STEM) workforce development programs at the graduate, professional, or early career level participate in evaluation and assessment systems. (Priority Goal) Six programs Achieved: 12 programs

 

Discussion

NSF’s Learning portfolio includes activities funded by the Education and Human Resources (EHR) and R&RA accounts. In FY 2009, an EHR working group developed performance metrics for all EHR programs. In FY 2010, efforts continued: EHR expanded and refined these goals and metrics, goals and metrics were developed for R&RA account programs, and all programs submitted evaluation plans.

These can be found at http://nsf.gov/about/budget/fy2012.

NSF’s FY 2010-2011 Priority Goal built on the learning portfolio metrics activities. Achieving the Priority Goal in FY 2011 also achieved Goal T-2.1. The following programs met the target:

  • Alliances for Graduate Education and the Professoriate (AGEP)
  • Faculty Early Career Development Program (CAREER)
  • Fellowships for Transformative Computational Science using Cyberinfrastructure (CI-TraCs)
  • Earth Sciences Postdoctoral Fellowship (EAR-PF)
  • Graduate Research Fellowship Program (GRF)
  • Integrative Graduate Education and Research Traineeship Program (IGERT)
  • International Research Fellowship Program (IRFP)
  • Mathematical Sciences Postdoctoral Research Fellowships (MSPRF)
  • Robert Noyce Teacher Scholarship Program (NOYCE)
  • Opportunities for Enhancing Diversity in the Geosciences (OEDG)
  • Postdoctoral Research Fellowship in Biology (PRFB)
  • Scholarship for Service/Cybercorps (SFS)

For more information on the achievement of the Priority Goal, see the Priority Goal section of this chapter or the Additional Performance Information at http://www.nsf.gov/about/budget/fy2013.

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-3

Keep the United States globally competitive at the frontiers of knowledge by increasing international partnerships and collaborations.

Strategic Target

NSF programs increasingly establish international partnerships that advance the frontiers of knowledge.

Lead Organization

Office of International Science and Engineering (OISE).

Goal T-3.1 International Implications
Table 27
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Identify number of new NSF program solicitations, announcements, and Dear Colleague Letters with international implications. Establish baseline. Achieved. Baseline: 23 solicitations, announcements, and Dear Colleague Letters

 

Discussion

NSF has a system that allows program officers to indicate, during the internal clearance stages, which solicitations, announcements, and Dear Colleague Letters have international implications. OISE conducted a baseline count of these materials. The solicitation numbers are listed in parentheses below.

  • Wiki for Enabling International Partnerships for the BREAD (Basic Research to Enable Agricultural Development) Program (11-017)
  • US-China Collaborative Research in Advanced Sensors and Bio-Inspired Technologies (11-024)
  • Japan/New Zealand Earthquakes/Tsunami (11-045)
  • NSF-Deutsche Forschungsgemeinschaft (DFG) Collaborative Research (11-053)
  • The "Earth Cube" - Towards a National Data Infrastructure for Earth System Science (11-065)
  • G8 Multilateral Funding Initiative "Interdisciplinary Program on Material Efficiency - A first step towards sustainable manufacturing” (11-068)
  • Dear Colleague Letter: United States and Ireland (11-070)
  • Catalyzing New International Collaborations (11-508)
  • Network for Earthquake Engineering Simulation Research (11-512)
  • Ethics Education in Science and Engineering (11-514)
  • Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM) (11-515)
  • Dimensions of Biodiversity (11-518)
  • Science and Technology Centers: Integrative Partnerships (11-522)
  • Metabolomics for a Low Carbon Society (11-527)
  • Research Coordination Networks (RCN) (11-531)
  • Software Infrastructure for Sustained Innovation (SI2) (11-589)
  • Centers for Chemical Innovation (CCI) (11-552)
  • Partnerships for International Research and Education (PIRE) (11-564)
  • Faculty Early Career Development (CAREER) (11-690)
  • Sustainability Research Networks Competition (SRN) (11-574)
  • NSF Graduate Research Fellowship Program (GRFP) (11-582)
  • International Collaboration in Chemistry between US Investigators and their Counterparts Abroad (ICC) (11-585)
  • Materials World Network: Cooperative Activity in Materials Research between US Investigators and their Counterparts Abroad (MWN) (11-568)

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-4

Enhance research infrastructure and promote data access to enable transformation at the frontiers.

Strategic Target

NSF prioritizes and manages facility investments throughout their life-cycle in a transparent and effective way.

Lead Organization

Large Facilities Office.

Goal T-4.1 Construction Project Monitoring
Table 28
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
Ongoing For all MREFC facilities under construction, keep negative cost and schedule variance at or below 10 percent. 100 percent of construction projects that are over 10 percent complete Achieved: 100 percent

 

Construction Project Monitoring Performance Trends, FY 2006-FY 2011
Figure 3

Construction Project Monitoring Performance Trends, FY 2006-FY 2011.png

Four facilities under construction were over 10 percent complete at the end of FY 2011. Of those four, all had cost and schedule variances under 10 percent. One of the projects, the Ocean Observatories Initiative, was rebaselined in FY 2011, which resulted in revised earned value variance bases.

The Major Research Equipment and Facilities Construction (MREFC) account supports the acquisition, construction, and commissioning of major research facilities and equipment that provide unique capabilities at the frontiers of science and engineering. Performance of construction projects funded by the MREFC account is monitored using the Earned Value Management (EVM) system. EVM is an integrated management control system for assessing, understanding, and quantifying what a contractor or field activity is achieving with program dollars. Monitoring cost and schedule is a standard measure of performance for construction projects.

Projects that are under 10 percent complete are not considered eligible for this goal because EVM data is less meaningful statistically in the very early stages of a project. Early in a project, the actual cost of the work, and the total value of the work scheduled and performed, are small compared to the total project cost and schedule. Consequently, their ratios—the reported cost and schedule variances—can change by large amounts even though the real values of their differences are small.
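A small numeric example makes this concrete. Using common earned value conventions (cost variance as (EV - AC)/EV and schedule variance as (EV - PV)/PV, conventions the plan itself does not spell out), the same few hundred thousand dollars of slippage produces a large reported variance early in a project and a negligible one late in a project. The dollar figures below are invented for illustration.

  # Illustrative EVM variance check against the 10 percent threshold.
  # Figures are invented; formulas follow common EVM practice, which the plan
  # itself does not specify.
  def variances(planned_value, earned_value, actual_cost):
      cost_variance = (earned_value - actual_cost) / earned_value
      schedule_variance = (earned_value - planned_value) / planned_value
      return cost_variance, schedule_variance

  def within_threshold(cv, sv, threshold=0.10):
      """Negative variances must stay at or below 10 percent in magnitude."""
      return cv >= -threshold and sv >= -threshold

  early = variances(planned_value=2.0, earned_value=1.7, actual_cost=1.9)      # $ millions
  late = variances(planned_value=200.0, earned_value=199.7, actual_cost=199.9)

  for label, (cv, sv) in (("early in project", early), ("late in project", late)):
      print(f"{label}: cost variance {cv:+.1%}, schedule variance {sv:+.1%}, "
            f"within threshold: {within_threshold(cv, sv)}")

In both cases the project is about $0.2 million over cost and $0.3 million behind plan, but only the early-stage project breaches the 10 percent threshold, which is why projects under 10 percent complete are excluded from the goal.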

Strategic Goal 1: Transform the Frontiers

Strategic Objective/Performance Goal T-4

Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and to enable transformation at the frontiers.

Strategic Target

Ensure data generated by NSF’s major multi-user facilities are widely accessible to the research community.

Lead Organization

Directorate for Mathematics and Physical Sciences.

Goal T-4.2 Data Management Practices at Large Facilities
Table 29
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Determine current data management practices at NSF-funded facilities
Current data management practices documented for 100 percent of NSF-funded facilities.
Achieved: 17 of 17 facilities.

 

The universe of facilities was defined as those appearing by name on the “Major Multi-User Research Facilities Funding” table in NSF’s FY 2011 Budget Request’s Facilities chapter:

  • Academic Research Fleet
  • Cornell High Energy Synchrotron Source/Cornell Electron Storage Ring
  • EarthScope
  • Gemini Observatory
  • Incorporated Research Institutions for Seismology
  • Integrated Ocean Drilling Program
  • Large Hadron Collider
  • Laser Interferometer Gravitational Wave Observatory
  • National Astronomy and Ionosphere Center
  • National Center for Atmospheric Research
  • National High Magnetic Field Laboratory
  • National Nanotechnology Infrastructure Network
  • National Optical Astronomy Observatory
  • National Radio Astronomy Observatory
  • National Solar Observatory
  • National Superconducting Cyclotron Laboratory
  • Network for Earthquake Engineering Simulation

During FY 2011, these facilities provided documentation of their data management plans that contained the following elements:

  • Types: Types of data, samples, physical collections, software, curriculum materials, and other materials that are managed and shared;
  • Standards: The standards used for data and metadata format and content (where existing standards are absent or deemed inadequate, this should be documented as well);
  • Access Policies: Policies for access and sharing including provisions for appropriate protection of privacy, confidentiality, security, intellectual property, or other rights or requirements;
  • Re-use Policies: Policies and provisions for re-use, re-distribution, and the production of derivatives; and
  • Archival: Plans for archiving data, samples, and other research products, and for preservation of access to them.

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-1

Make investments that lead to results and resources that are useful to society.

Strategic Target

NSF investments underpin long-term solutions to societal challenges such as economic development, climate change, energy, and cyber-security.

Lead Organization

Directorate for Engineering (ENG).

Goal I-1.1 IIP Grantees’ Partnerships
Table 30
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Industrial and Innovation Partnerships (IIP): Identify the number and types of grantees’ partnerships. Establish baseline. Achieved. Baseline: 1,567 partnerships

 

Using ENG’s IIP division as the model for collecting data on diverse types of partnerships is intended as the beginning of a process to identify how the links between science, industry, and innovation mediate the long-term impacts of NSF investments.

The baseline for all SBIR/STTR, PFI and I/UCRC partnerships from FY 2010 is in the following table.

Table 31
PARTNER TYPE TOTAL SUB-TYPE COUNT
For Profit 929 < 500 Employees 453
For Profit 929 > 500 Employees 476
For Profit 929 Corporate Ventures 0
Not For Profit 95 Foundations 14
Not For Profit 95 501C3s 14
Not For Profit 95 Consortia/Associations 67
Government 205 State 41
Government 205 Local 30
Government 205 Federal 125
Government 205 Foreign 9
Academic 296 Community Colleges 6
Academic 296 Colleges 7
Academic 296 Universities 283
Investors 42 Angels 29
Investors 42 Venture Capitalists 13
Unknown/Undisclosed 445   445
Total Number of Partnerships     1,567

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-2

Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.

Strategic Target

NSF’s scientific literacy and public engagement programs are supported by rigorous evidence about learning outcomes.

Lead Organization

Division of Research on Learning in Formal and Informal Settings (DRL), EHR.

Goal I-2.1 Public Understanding and Communication
Table 32
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Identify number of programs that fund activities that address public understanding and communication of science and engineering. Establish baseline. Achieved. Baseline: 16 programs

 

Certain DRL programs explicitly aim to address public understanding and communication of science and engineering, but other NSF activities may also work towards this aim. This Goal’s intent was to identify all such activities across the Foundation. A keyword search of NSF program solicitations using the term “public understanding” was used to generate the baseline. As of September 27, 2011, 16 programs had this phrase in their solicitations. The solicitation numbers are listed in parentheses below.

  • OPP: Antarctic Artists and Writers Program (11-549)
  • OPP: Antarctic Research (11-532)
  • GEO: Centers for Ocean Sciences Education Excellence (10-527)
  • BIO: Collections in Support of Biological Research (11-558)
  • OIA/Cross-Cutting: Experimental Program to Stimulate Competitive Research: Workshop Opportunities (EPS) (06-583)
  • OIA/Cross-Cutting: EPSCoR Research Infrastructure Improvement Program: Track-1 (11-565)
  • GEO: Geoscience Education (10-512)
  • GEO: Opportunities for Enhancing Diversity in the Geosciences (OEDG) (10-599)
  • Cross-Cutting: Climate Change Education (CCE): Climate Change Education Partnership (CCEP) Program, Phase I (10-542)
  • Cross-Cutting: Nanoscale Science and Engineering Education (05-543)
  • Cross-Cutting: Science and Technology Centers: Integrative Partnerships (11-522)
  • Cross-Cutting: Sustainable Energy Pathways (11-590)
  • EHR: Informal Science Education (11-546)
  • EHR: Innovative Technology Experiences for Students and Teachers (11-525)
  • EHR: Research and Evaluation on Education in Science and Engineering (10-586)
  • EHR: Research on Gender in Science and Engineering FY 2010 (10-516)

Searches of awarded project descriptions and abstracts identified additional programs that address this focus without mentioning it explicitly in a formal program solicitation, but those counts were not used for the baseline because the context in which the phrase “public understanding” appears in funded project descriptions varies.
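The baseline above came from a keyword search of program solicitations for the phrase “public understanding.” As a minimal sketch of that kind of screen (the directory layout and file format here are assumptions for illustration, not the systems NSF actually searched):

  # Illustrative keyword screen of solicitation texts (hypothetical layout).
  from pathlib import Path

  PHRASE = "public understanding"
  matches = []
  for path in Path("solicitations").glob("*.txt"):   # assumed: one text file per solicitation
      if PHRASE in path.read_text(encoding="utf-8", errors="ignore").lower():
          matches.append(path.stem)

  print(f"{len(matches)} solicitations mention '{PHRASE}'")
  for name in sorted(matches):
      print(" ", name)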

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-2

Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.

Strategic Target

NSF’s K-12 STEM education investments are designed and tested for scale-up.

Lead Organization

Directorate for Education and Human Resources.

Goal I-2.2 K-12 Components
Table 33
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Identify number of programs that fund activities with K-12 components. Establish baseline. Achieved. Baseline: 16 programs.

 

There is increasing interest across the federal government not just in counting the programs that address K-12 education, but in examining the potential of projects to go to scale: to move beyond the initial site and be implemented successfully under typical conditions, with population groups broadly reflective of those intended for the scale-up setting (Draft Evidence Standards). A more accurate and complete list of NSF’s K-12 programs is the first step in identifying the programs that have the capacity to determine the conditions that enable projects to go to scale.

The websites of all NSF directorates and divisions were searched for evidence of an explicitly stated K-12 programmatic mission or a programmatic component directed explicitly at K-12 education. Sixteen programs were identified:

  • CISE: Research Assistantships for High School Students (RAHSS)
  • CISE: Computing Education for the 21st Century (CE21)
  • CISE: Research Experiences for Teachers (RET) Supplement
  • CISE/EHR/SBE: Cyberlearning: Transforming Education
  • EHR: Discovery Research K-12
  • EHR: Innovative Technology Experiences for Students and Teachers (ITEST)
  • EHR: Math and Science Partnership (MSP)
  • EHR: Research & Evaluation on Education in S&E (REESE)
  • EHR: Transforming STEM Learning (TSL) (Combines with DRK-12 in FY 2012)
  • EHR: Advanced Technological Education (ATE)
  • ENG: Research Experiences for Teachers (RET) Sites
  • GEO: Geoscience Education (GeoEd)
  • GEO: Opportunities for Enhancing Diversity in the Geosciences (OEDG)
  • GEO: GEO Teach
  • SBE: Science of Learning Centers (SLC)
  • Multiple: Climate Change Education (CCE) Phase II (Not yet implemented)

The National STEM Digital Library (NSDL) program was originally on the list, but was last funded in FY 2011. The Presidential Awards for Excellence in Mathematics and Science Teaching (PAEMST) program is directed at K-12, but because it is a teacher recognition program with no potential for going to scale, it was not included.

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-3

Support the development of innovative learning systems.

Strategic Target

NSF invests in innovative learning tools and structures that use emerging technologies and are tested for effectiveness and scalability.

Lead Organization

Directorate for Education and Human Resources (EHR).

Goal I-3.1 Innovative Learning Systems
Table 34
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Identify number of programs that fund the development of research-based innovative learning systems Establish baseline. Achieved. Baseline: 150 awards within 28 program elements.


 This Goal’s intent was to identify activities across the Foundation that contribute to development of innovative learning systems. Such activities are not funded by any one program within NSF. After determining NSF’s baseline for this area of research, targets for subsequent years can be designed.

A latent semantic analysis tool was used to find awards made in FY 2011 that fit into the general category of Research-Based Innovative Learning Systems (ILS). A total of 150 awards were identified, 95 percent of which were made by eight divisions within the EHR, Computer and Information Science and Engineering (CISE), and Engineering (ENG) directorates. The awards corresponded to 28 distinct programs.
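The plan does not name the latent semantic analysis tool that was used. As a rough, hypothetical illustration of the general technique, the sketch below builds an LSA space over award abstracts with TF-IDF weighting and truncated SVD (scikit-learn) and flags abstracts close to a seed description of innovative learning systems; the abstracts, seed text, and similarity cutoff are all invented.

  # Hypothetical LSA screen of award abstracts (illustrative only; this is not
  # the tool NSF used, and the texts and cutoff are invented).
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.decomposition import TruncatedSVD
  from sklearn.metrics.pairwise import cosine_similarity

  abstracts = [
      "Adaptive tutoring software that personalizes STEM instruction for undergraduates",
      "Synthesis and characterization of high-temperature superconducting materials",
      "Game-based learning environment for middle school mathematics classrooms",
  ]
  seed = "research-based innovative learning systems using emerging technologies"

  # TF-IDF term weights, then a low-rank projection (the LSA step).
  tfidf = TfidfVectorizer(stop_words="english")
  term_matrix = tfidf.fit_transform(abstracts + [seed])
  lsa = TruncatedSVD(n_components=2, random_state=0)   # tiny rank for a toy corpus
  vectors = lsa.fit_transform(term_matrix)

  # Cosine similarity of each abstract to the seed description in LSA space.
  scores = cosine_similarity(vectors[:-1], vectors[-1:]).ravel()
  for text, score in zip(abstracts, scores):
      flag = "ILS candidate" if score > 0.5 else "not flagged"   # cutoff is arbitrary
      print(f"{score:+.2f}  {flag}: {text}")

In practice the same kind of screen would run over the full set of FY 2011 award abstracts, with the flagged awards reviewed by program staff before being counted toward a baseline.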

Strategic Goal 2: Innovate for Society

Strategic Objective/Performance Goal I-3

Support the development of innovative learning systems.

Strategic Target

New partnerships among scientists, engineers, and educators (both theorists and practitioners) take innovations from development to practice.

Lead Organization

Directorate for Education and Human Resources.

Goal I-3.2 Partnerships for Learning Technologies
Table 35
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Identify number of programs that fund activities that promote partnerships that support development of learning technologies. Establish baseline. Achieved. Baseline: 14 programs.


Interdisciplinary partnerships that support development of learning technologies are funded by organizational units across the Foundation. This Goal’s intent is to identify all such activities so an NSF-wide baseline can be determined. Fourteen core programs with the capacity to fund partnerships for learning technologies were identified. The solicitation numbers are listed in parentheses below.

  • CISE: Computing Education for the 21st Century (CE21) (10-619)
  • OIA: Cyber-Enabled Discovery and Innovation (CDI) (11-502)
  • OCI: Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM) (11-515)
  • CISE: Cyberlearning: Transforming Education (11-587)
  • EHR: Discovery Research K-12 (DR K-12) (11-588)
  • CISE: Human-Centered Computing (HCC) (11-556)
  • CISE: Human-Robot (and/or Agents) Interaction (HRI) (11-556)
  • EHR: Informal Science Education (ISE) (11-546)
  • EHR: Innovative Technology Experiences for Students and Teachers (ITEST) (11-525)
  • EHR: Math and Science Partnership (MSP) (10-556)
  • EHR: Research and Evaluation on Education in Science and Engineering (REESE) (10-586)
  • SBE: Science of Learning Centers (SLC)
  • EHR: Transforming STEM Learning (TSL) (10-602)
  • EHR: Transforming Undergraduate Education in Science, Technology, Engineering and Mathematics (TUES) (10-544)

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Lead Organization

Division of Human Resources Management (HRM).

Goal M-1.1 Intergovernmental Personnel Agreement (IPA) Performance Plans
Table 36
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Include temporary staff appointed under the Intergovernmental Personnel Act (IPAs) under NSF’s performance management system. 80 percent of all IPAs have performance plans as of July 1, 2011 Achieved: 92 percent of all IPAs had performance plans as of July 1, 2011
2011 Include temporary staff appointed under the Intergovernmental Personnel Act (IPAs) under NSF’s performance management system. 90 percent of IPAs in executive-level positions have performance plans as of July 1, 2011 Achieved: 90 percent of executive IPAs had performance plans as of July 1, 2011.


This goal is designed to establish agency-wide guidance for assessing IPAs’ performance and to ensure consistent implementation. It was formulated to address specific human resource management challenges identified by Congress, the Office of Personnel Management, and NSF’s Office of the Inspector General.

The Intergovernmental Personnel Act (IPA) Mobility Program allows the temporary assignment of personnel between Federal agencies and other governmental, academic, tribal, and eligible non-profit organizations. IPA assignments can strengthen management, assist in the transfer and implementation of new technology, involve officials of other organizations in developing and implementing Federal policies and programs, and enhance the professional abilities of the participants. IPA assignments are management-initiated and are made for the benefit of Federal agencies and non-Federal organizations.

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Lead Organization

Division of Human Resources Management.

Goal M-1.2 360 Degree Evaluation Instrument
Table 37
Fiscal Year Goal Statement Target Measure, Milestone, or Deliverable Result
2011 Pilot use of OPM’s 360 degree evaluation instrument to provide feedback to NSF leaders and managers on skills and abilities. By July 1, 2011, at least 20 NSF managers use OPM’s 360 instrument. Achieved: 27 managers by 7/1/11
2011 Pilot use of OPM’s 360 degree evaluation instrument to provide feedback to NSF leaders and managers on skills and abilities. By September 30, 2011, at least 20 NSF managers who used OPM’s 360 instrument establish a plan for improving performance. Achieved late: six managers by 9/30/2011, 29 managers by 10/15/2011

 

The aim of this goal is to provide managers “structured feedback about leadership skills from their supervisors, peers, subordinates, and themselves.” The feedback is intended to be developmental in nature and to help managers “identify leadership strengths and developmental opportunities.” It was formulated to address specific human resource management challenges identified by Congress, the Office of Personnel Management, and NSF’s Office of the Inspector General.

In FY 2011, a number of executive-level managers were invited to participate in the OPM 360 assessment process. Twenty-seven NSF managers completed the OPM 360 Leadership Assessment by the target date of July 1, 2011, exceeding the target of 20 managers.

Information on delayed achievement

Only six of the managers who participated in the OPM 360 Assessment submitted an Executive Level Development Plan (EDP) for improving performance by 9/30/2011. NSF EDPs are not traditionally due until October 15. Twenty-nine managers who used the OPM 360 Evaluation Instrument had submitted updated performance plans by the 10/15/2011 standard submission date.

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-1

Achieve management excellence through leadership, accountability, and personal responsibility.

Strategic Target

More effective management enables all staff to understand how their duties support the mission of the Foundation.

Lead Organization

Office of Diversity and Inclusion.

Goal M-1.3 Model EEO Agency
Table 38
Fiscal Year | Goal Statement | Target Measure, Milestone, or Deliverable | Result
2011 | Attain essential elements of a model Equal Employment Opportunity (EEO) program, as defined in Equal Employment Opportunity Commission (EEOC) requirements. | Three elements. | Achieved: Three elements obtained.


For NSF to achieve model EEO agency status, it must meet and maintain each of the six criteria established by the Equal Employment Opportunity Commission (EEOC). The EEOC refers to these criteria as the “Essential Elements” of a Model Agency, which are:

A. Demonstrated commitment from agency leadership;
B. Integration of EEO into the agency's strategic mission;
C. Management and program accountability;
D. Proactive prevention of unlawful discrimination;
E. Efficiency; and
F. Responsiveness and legal compliance.

The target of three is based on the progress reported in FY 2010 as compared to resource-responsive expectations for FY 2011. NSF took the following actions to achieve measures that were unmet in FY 2010 as related to essential elements A, B, and E:

A. Demonstrated commitment from agency leadership

  • Training that included Alternative Dispute Resolution was offered to all managers and supervisors, with 100 percent participation.
  • Training that included a module on religious accommodations was offered to all managers and supervisors to ensure they have a clear understanding of their roles and responsibilities, again with 100 percent participation.

B. Integration of EEO into the agency's strategic mission

  • In FY 2011, ODI’s Director delivered NSF’s first “State of the Agency” briefing to NSF senior officials, covering all components of the EEO annual report to the EEOC.

E. Efficiency

  • ODI worked closely with the Office of the General Counsel (OGC), which represents the agency on EEO matters, to ensure impartiality in EEO matters processed within ODI.

Additionally, in conducting its annual barrier analysis, ODI accomplished the following:

  • Worked with senior managers in three directorates in identifying potential barriers to EEO.
  • Worked with these senior managers in devising plans to eliminate barriers, implementing such plans, and ensuring that ODI plays a role in assessing the effectiveness of such plans.

Strategic Goal 3: Perform as a Model Organization

Strategic Goal 3

Perform as a Model Organization

Strategic Objective/Performance Goal M-2

Infuse learning as an essential element of the NSF culture with emphasis on professional development and personal growth.

Strategic Target

NSF emphasizes learning for personal and professional development for all staff.

Lead Organization

Division of Human Resources Management.

Goal M-2.1 Assess Developmental Needs
Table 39
Fiscal Year | Goal Statement | Target Measure, Milestone, or Deliverable | Result
2011 | Pilot process for assessing developmental needs and addressing them. | By March 31, 2011, commence survey of administrative support staff. | Achieved.
2011 | Pilot process for assessing developmental needs and addressing them. | By September 20, 2011, obtain contract support for assessment of non-administrative-support staff. | Achieved late: Contract support obtained September 23, 2011.


NSF stresses personal learning and development to enhance performance, further our knowledge base on all aspects of NSF activity, and continue to build for the future. This directly reflects the specific action identified in the Strategic Plan: “review current NSF learning opportunities and develop a plan for addressing gaps.”

In FY 2011, the Division of Human Resource Management (HRM) developed and launched targeted needs analysis questionnaires designed to generate new learning needs data. The questionnaires asked NSF administrative support staff to rate their performance of specific skills necessary to complete their work and to rate the importance of creating additional skill-based learning and development opportunities to help them successfully complete their work. Approximately 38 percent of administrative support staff participated in the survey.

Information on delayed achievement

In support of the target “to obtain contract support for assessment of non-administrative-support staff,” OPM issued a contract solicitation for this work in the third quarter of FY 2011. The results of NSF’s evaluation were sent to OPM on 8/8/11, and OPM issued a Notice to Proceed to NSF for this contract on 9/23/11. While this Notice to Proceed fell beyond the target date of 9/20/11, NSF had minimal control over the contract procurement effort once the contract solicitation was issued in the third quarter of FY 2011.

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-3

Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service.

Strategic Target

NSF uses the innovation and creativity of our staff to improve agency processes and systems on a continuing basis.

Lead Organization

Division of Financial Management, Office of Budget, Finance, and Award Management (BFA).

Goal M-3.1 Grant-By-Grant Payments
Table 40
Fiscal Year | Goal Statement | Target Measure, Milestone, or Deliverable | Result
2011 | Gather functional requirements for changes in current system processes that will accommodate the transition to a grant-by-grant payment method. | Documentation of functional requirements. | Achieved late: Functional requirements delivered first quarter of FY 2012.


Establishing the capability to monitor expenditures at the award level is an essential aspect of NSF’s financial system modernization. NSF is committed to transitioning its financial processing of grants from a pooled system (quarterly reporting of expenditures by institution) to grant-by-grant processing (near real-time reporting of expenditures by award) by FY 2013. This change will have many advantages for both NSF and its grantees, such as better and more timely financial data and stronger monitoring programs. This initiative responds to the Administration and CFO directive to increase efficiency and transparency in the agency.
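The difference between the two reporting methods can be pictured with a small data sketch. The following Python snippet is a minimal illustration using hypothetical column names and amounts; it does not represent any NSF system or dataset.

```python
# Illustrative sketch only: contrasts pooled (institution-level, quarterly)
# expenditure reporting with grant-by-grant (award-level) reporting.
# All names and amounts here are hypothetical.
import pandas as pd

expenditures = pd.DataFrame({
    "institution": ["Univ A", "Univ A", "Univ B"],
    "award_id":    ["1101001", "1101002", "1101003"],
    "quarter":     ["FY2011Q4", "FY2011Q4", "FY2011Q4"],
    "amount":      [125000.00, 80000.00, 42500.00],
})

# Pooled method: a single expenditure figure per institution per quarter.
pooled = expenditures.groupby(["institution", "quarter"])["amount"].sum()

# Grant-by-grant method: expenditures tracked for each individual award.
by_award = expenditures.groupby("award_id")["amount"].sum()

print(pooled)
print(by_award)
```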

Information on delayed achievement

The FY 2011 target for this goal was the documentation of functional requirements that will accommodate transition to a real-time payment method. Documentation detailing final business requirements and draft functional requirements was prepared by BFA in FY 2011. However, during cross-agency working group meetings, additional requirements were raised that needed to be addressed. Consequently, this target was not met in FY 2011. In fulfillment of the goal, the requirements were delivered in December 2011.

Strategic Goal 3: Perform as a Model Organization

Strategic Objective/Performance Goal M-3

Encourage and sustain a culture of creativity and innovation across the agency to ensure continuous improvement and achieve high levels of customer service.

Strategic Target

NSF organizations achieve high levels of customer satisfaction.

Lead Organization

Office of the Director.

Goal M-3.2 Time to Decision
Table 41
Fiscal Year | Goal Statement | Target Measure, Milestone, or Deliverable | Result
Ongoing | Inform applicants whether their proposals have been declined or recommended for funding within six months of deadline, target date, or receipt date, whichever is later. | 70 percent. | 78 percent.


Time to Decision Performance Trends, FY 2006-FY 2011
Figure 4


* In FY 2009, this goal was in effect only for the period October 1 through December 31, 2008 (Quarter 1, FY 2009). The goal was suspended for all actions taking place between January 1, 2009 and September 30, 2009 to allow for a greater number of proposals to be processed with the additional funds from the American Recovery and Reinvestment Act of 2009 (ARRA).

Time to decision or “dwell time” is the amount of time that passes between receipt of a proposal and notification of the principal investigator of the funding decision. One of the most significant issues raised in customer satisfaction surveys is the time it takes NSF to process proposals. Too long a time period inhibits the progress of research because it delays the funding process, but too short a time period may compromise the merit review process. The six-month target seeks to strike a balance between the investigator’s need for timely action and NSF’s need for a credible and efficient merit review system.
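The measure itself is a simple calculation over proposal dates. The sketch below illustrates it in Python, assuming a hypothetical list of proposal records and an approximate 183-day window measured from the receipt date only (NSF measures from the later of deadline, target date, or receipt date); it is not NSF’s actual reporting code.

```python
# Minimal sketch of a "time to decision" calculation on hypothetical records;
# NSF's actual systems and data are not represented here.
from datetime import date

proposals = [
    {"received": date(2011, 1, 15), "decided": date(2011, 6, 30)},
    {"received": date(2011, 2, 1),  "decided": date(2011, 9, 15)},
    {"received": date(2011, 3, 10), "decided": date(2011, 8, 1)},
]

SIX_MONTHS_IN_DAYS = 183  # approximate six-month window

within_target = sum(
    1 for p in proposals
    if (p["decided"] - p["received"]).days <= SIX_MONTHS_IN_DAYS
)
share = 100.0 * within_target / len(proposals)
print(f"{share:.0f} percent of proposals decided within six months")
```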

The most relevant recent variations in performance took place in FY 2009 and FY 2010. In FY 2009, the goal was suspended after the first quarter to allow for a greater number of proposals to be processed with additional funds from ARRA. The goal was reinstated in FY 2010, when NSF exceeded this goal despite a significant increase in workload. Overall, staffing levels increased by 5.5 percent between FY 2008 and FY 2011, while proposal pressure increased by 17.4 percent.

FY 2011 External Program Evaluations List

NSF relies on the judgment of external experts to maintain high standards of program management, to provide advice for continuous improvement of NSF performance, and to ensure openness to the research and education community served by the Foundation. NSF obtains this feedback through periodic scheduled reviews such as external evaluations, Committees of Visitors, directorate Advisory Committees, and facilities reviews.

External Evaluations Conducted by Contractors

NSF directorates, divisions, and programs use the recommendations of external experts in their decision-making. During FY 2011, seven external evaluations of NSF’s existing programs and strategic investments were published. These evaluations include the results of studies, reports, and workshops commissioned by various programmatic offices within the National Science Foundation.

  • ENG: Research Experiences For Undergraduates in the Directorate For Engineering (ENG): Follow-up of FY 2006 Student Participants
  • ENG: External education evaluations conducted by Engineering Research Center (ERC) Lead Universities
  • ENG: Assessment of the National Science Foundation’s Emerging Frontiers in Research and Innovation (EFRI) Program
  • ENG: Grand Challenges in Earthquake Engineering Research: A Community Workshop Report
  • MPS: Feasibility Study for Evaluation of the Mathematical Science Research Institutes
  • MPS: Findings of the Feasibility Study for Evaluation of the Phased Approach Used to Implement the Centers for Chemical Innovation
  • OPP: Future Science Opportunities in Antarctica and the Southern Ocean

Descriptions of these evaluations can be found on the following pages.

DIRECTORATE FOR ENGINEERING

Division of Engineering Education and Centers (EEC)
Table 42
Evaluation Name Research Experiences For Undergraduates (REU) in the Directorate for Engineering: Follow-Up of FY 2006 Student Participants
Contractor SRI International
Program Name Research Experiences for Undergraduates (REU) in Engineering
Completion date October 2010
Program Description

Chief among the programs intended to increase graduate-degree production in fields covered by the National Science Foundation (NSF) is the Research Experiences for Undergraduates (REU) program, which has been in existence for more than 20 years. ENG has two major award types for REUs—Site and Supplement awards.

Evaluation Description

This is the second phase of a study of the REU in Engineering Program. It was conducted through a follow-up survey of the 1,248 FY 2006 REU students who responded to the student survey administered in the first phase of the study. The purpose of the second data collection was to measure the longer-term outcomes of the ENG REU and other undergraduate research experiences. The follow-up survey focused on the totality of undergraduate research experiences (rather than the FY 2006 ENG REU experience), and the effects of those experiences on academic and career decisions that former REU students had made by December 2009.

The evaluation report describes the major outcomes and other findings from the 2009 follow-up survey. The summary begins with the major outcomes, continues with other overall findings, and ends with the differences by sex, race/ethnicity, and REU award type.

Findings

Key findings include the following:

  • Most respondents had earned their undergraduate degree and gone on to graduate school.
  • The majority of respondents considered their undergraduate research experiences to be a factor in their decisions about whether to go to graduate school, what field to study, and where to apply, and in being accepted into their graduate institution.
  • Most undergraduate engineering students stayed in engineering for graduate school.
  • Most respondents raised their highest degree expectations over time. Today few expect to stop at a bachelor’s degree.
  • For more than one-half of respondents, undergraduate research led to increased interest in engineering and/or research as potential careers. One-fourth of respondents reported that undergraduate research introduced them to a career they had not known existed. More than 4 in 5 respondents considered their undergraduate research experiences to have been extremely or fairly important to their career decision.
  • About 3 in 10 respondents became less interested in a research career once they had a better understanding of what was involved.
  • About 37% of respondents were employed and not in school at the time of the survey. A substantial majority of these workers had jobs that involved engineering and were employed in the for-profit sector.
  • More than 7 in 10 non-student workers were using their research skills at least somewhat in their jobs.
Recommendations

By far the most frequent recommendation for how to improve undergraduate research experiences was to increase the involvement of mentors and faculty members.

Agency response to recommendations

The survey has been disseminated to all ENG REU Site Directors. Plans are to discuss recommendations with REU Site Directors during the March 2011 annual PI meeting to gain feedback from the PIs on how involvement of mentors and faculty members with undergraduate participants may be increased and improved.

Publications

Publications are located at http://csted.sri.com/content/researc...2006-student-p

Actual Cost

Pending

DIRECTORATE FOR ENGINEERING

Division of Engineering Education and Centers (EEC)
Table 43
Evaluation Name External education evaluations conducted by Engineering Research Center (ERC) Lead Universities
Contractor Various organizations, please see narrative below
Program Name Engineering Research Centers (ERC) Program of the Division of Engineering Education and Centers
Completion date Evaluations are ongoing in each separate ERC over a ten-year period
Program Description

There are currently seventeen (17) active, NSF-funded ERCs. Each ERC provides an environment in which academe and industry can collaborate in pursuing innovations in research and education that can impact curricula at all levels, from pre-college to life-long learning. These education programs are required to carry out evaluations/assessment of progress and outcomes. To accomplish this, the ERC lead university employs staff, faculty, or contractors.

During FY 2011, program evaluations of three ERCs were conducted:

  • Center for Collaborative, Adaptive, Sensing of the Atmosphere – University of Massachusetts;
  • Mid-IR Tech. for Health & the Environment – Princeton University; and
  • Synthetic Biology ERC – UC-Berkeley.
Recommendations

None of these programs provides direct recommendations to NSF. They provide feedback to the ERCs’ education program leaders and the Center Directors, and the results are assessed by annual site visit review teams managed by the ERC Program.

Agency response to recommendations

N/A

Publications

N/A

Actual Cost
  • FY 2011: N/A since the evaluation costs are borne by the ERC
  • FY 2010: N/A since the evaluation costs are borne by the ERC.

DIRECTORATE FOR ENGINEERING

Emerging Frontiers in Research and Innovation (EFRI) Office
Table 44
Evaluation Name Assessment of the National Science Foundation’s Emerging Frontiers in Research and Innovation (EFRI) Program
Contractor Science and Technology Policy Institute (STPI)
Program Name Emerging Frontiers in Research and Innovation (EFRI) program
Completion date March 2011
Program Description

The Emerging Frontiers in Research and Innovation (EFRI) program was created in October 2006 after a reorganization of the Directorate for Engineering (ENG) within the National Science Foundation (NSF). The overall goal of the program is to fund higher risk opportunities for research that will lead to “new research areas for NSF, ENG, and other agencies; new industries or capabilities that result in a leadership position for the country; and/or significant progress on a recognized national need or grand challenge.” The program achieves this through funding interdisciplinary teams to conduct potentially transformative research. The first EFRI solicitation was announced in FY 2007. Since then, EFRI has supported 44 projects through grant funding totaling about $90 million.

Evaluation Description

In FY 2010–2011, ENG funded STPI to perform a formative assessment and evaluation of the processes by which topic areas and potentially transformative projects are selected by EFRI, and to design a protocol for future outcome evaluation of EFRI research projects.

Findings

The findings clustered into three areas: process-related findings (related to program design and implementation); the distinctiveness of the EFRI program within the Engineering Directorate (ENG), in particular that of its PIs, topics, and proposals; and methodological findings.

Recommendations

The following recommendations were made:

  • Consider using just the Grand Challenge criterion for topics.
  • Use Web 2.0 and other collective intelligence tools for voting on topics.
  • Have a definition of potentially transformative research that is easier to operationalize and standardize.
  • Consider defining and specifying “cognitive integration” as a criterion rather than focusing on requiring PIs from multiple disciplines.
  • Recruit non-traditional reviewers such as maverick researchers or entrepreneurs, among others.
  • Provide standardized training regarding the criteria to ensure all panelists have the same understanding of what the EFRI program is seeking.
  • EFRI should consider a more descriptive approach to an outcome evaluation.
  • Test the emerging concept of calculating “integration scores” and data visualization tools on larger data sets to assess interdisciplinarity in the science and engineering policy community.
Agency response to recommendations

In response to this report and 2011 COV recommendations, the directorate substantially revised the research topic selection process and instituted several internal process innovations in FY 2011–2012.

Publications

Balakrishnan, A., M. B. Hughes, V. Peña, D. Roessner, and B. Lal. 2011. Assessment of the National Science Foundation’s Emerging Frontiers in Research and Innovation (EFRI) Program. Science and Technology Policy Institute, Washington, DC.

Actual Cost

$304,892

DIRECTORATE FOR ENGINEERING

Division of Civil, Mechanical and Manufacturing Innovation (CMMI)
Table 45
Evaluation Name Grand Challenges in Earthquake Engineering Research: A Community Workshop Report
Contractor National Research Council
Program Name George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES)
Completion date 2011
Program Description

The George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES), supported by the National Science Foundation (NSF), is an important component of the National Earthquake Hazards Reduction Program (NEHRP). NEHRP is a coordinated effort across four federal agencies to address earthquake risk in the United States. Since 2004, NEES researchers have produced significant advances in the science and technology for earthquake loss reduction that would not have been possible without the network’s experimental facilities and cyberinfrastructure. By FY 2014, NSF will have supported 10 years of NEES operations and research.

Evaluation Description

As part of NSF’s preparation of plans for FY 2014 and beyond, NSF sought an evaluation of next-generation U.S. needs for earthquake engineering research beyond 2014. At the request of NSF, the National Research Council (NRC) hosted a two-day workshop to give members of the community an opportunity to address two major questions:

  • What are the high-priority Grand Challenges in basic earthquake engineering research that require a network of earthquake engineering experimental facilities and cyberinfrastructure?
  • What networked earthquake engineering experimental capabilities and cyberinfrastructure tools are required to address these Grand Challenges?

The workshop featured invited presentations and discussion. Workshop participants were asked to describe the experimental infrastructure capabilities and cyberinfrastructure tools in terms of requirements, rather than by reference to any existing or specifically located future facilities. In responding to the foregoing questions, workshop participants were also asked to consider future technical and conceptual advances with the potential to influence future earthquake hazard research, such as early warning systems, new materials, sustainability, high-performance computing and networking, modeling, sensor and monitoring technologies, and other factors identified by the committee. The committee prepared a report summarizing discussions at the workshop.

Findings

The report did not include findings.

Recommendations

The report did not include recommendations.

Agency response to recommendations

The NSF response is under development as outlined in the NSF 10-071 Dear Colleague Letter (http://nsf.gov/pubs/2010/nsf10071/nsf10071.jsp). The response is anticipated to be final by fall 2012.

Publications

Committee for the Workshop on Grand Challenges in Earthquake Engineering Research--A Vision for NEES Experimental Facilities and Cyberinfrastructure Tools; Committee on Seismology and Geodynamics; National Research Council. 2011. Grand Challenges in Earthquake Engineering Research: A Community Workshop Report. The National Academies Press. http://www.nap.edu/catalog.php?record_id=13167.

Actual Cost

$251,052

DIRECTORATE FOR MATHEMATICAL AND PHYSICAL SCIENCES

Division of Mathematical Sciences (DMS)
Table 46
Evaluation Name Feasibility Study for Evaluation of the Mathematical Science Research Institutes
Contractor Science and Technology Policy Institute (STPI)
Program Name Mathematical Science Research Institutes
Completion date December 2010
Program Description

Eight Mathematical Science Research Institutes (“Institutes”) are currently supported by DMS. In recent years, DMS supervision of the Institutes’ awards has evolved from management of individual Institute awards to management of the suite of awards as a portfolio. This shift in management philosophy has generated interest in evaluation at the level of the portfolio to supplement regular NSF processes for peer review of individual Institutes.

Evaluation Description

DMS requested a study to assess the need for and feasibility of evaluation of the Institutes at the portfolio level. If formal program evaluation was determined to be warranted and feasible, an additional objective was to recommend an evaluation approach and strategy.

Findings and Recommendations

The study’s key findings are as follows:

  • A separate evaluation approach and design would be required for Institutes using each of the three convening models.
  • Evaluation of Institute for Advanced Study (IAS) and American Institute of Mathematics (AIM) as individual Institutes is feasible, but it does not appear to be warranted.
  • Evaluation of the long program convening model as implemented by Mathematical Sciences Research Institute (MSRI), Statistical and Applied Mathematical Sciences Institute (SAMSI), Institute for Pure and Applied Mathematics (IPAM), Mathematical Biosciences Institute (MBI), and Institute for Mathematics and Its Applications (IMA) is warranted.
  • Systematic evaluation of long program outcomes at the Institute level is not feasible in the short term, but it may be feasible in the long term if NSF standardizes key data collection processes.
  • Evaluation of the long program convening model is feasible using primarily qualitative methods to examine individual long programs.
  • Evaluation of outcomes associated with additional education, training, and outreach activities at the various Institutes is neither feasible nor warranted.
  • Formal evaluation of outcomes associated with coordination among the Institutes is neither feasible nor warranted, but opportunities and best practices could be explored informally.
Agency response to recommendations

N/A

Publications

Zuckerman, B., C. V. Srivastava, P. C. Boardman, C. Weber, and S. Jonas. 2010. Feasibility Study for Evaluation of the Mathematical Science Research Institutes. Science and Technology Policy Institute, Washington, DC.

Actual Cost

$63,406

DIRECTORATE FOR MATHEMATICAL AND PHYSICAL SCIENCES

Division of Chemistry (CHE)
Table 47
Evaluation Name Findings of the Feasibility Study for Evaluation of the Phased Approach Used to Implement the Centers for Chemical Innovation
Contractor Science and Technology Policy Institute (STPI)
Program Name Centers for Chemical Innovation (CCI) program
Completion date June 2011
Program Description

The CCI program of CHE appears to be unique among NSF Centers programs in that it has been implemented using a phased approach. The program competitively awards up to $1.5M over three years for “Phase I” activities prior to accepting “Phase II” applications for full Center awards at the beginning of the third year.

Evaluation Description

In August 2009, NSF tasked the Science and Technology Policy Institute (STPI) of the Institute for Defense Analyses (IDA) to assess the feasibility of evaluating the phased implementation of the CCI program. The goals of the feasibility study were to identify potential research questions of interest to NSF, assess existing and potential sources of evidence about the program, and determine whether those questions can be answered. Potential research questions were identified through two sources: (1) in-depth discussions with NSF leadership about the program and priorities for evaluation; and (2) analysis of program logic. The assessment of available evidence included a thorough review of available program documents, including program announcements and applications for funding. Published literature on science policy and research evaluation was also consulted as appropriate.

Findings and Recommendations

The study’s key findings are as follows:

  • Evaluation of the phased implementation of the CCI program is warranted.
  • There are serious concerns about the feasibility of assessing whether the phased approach encouraged experimentation while mitigating the Foundation’s exposure to undesirable risk.
  • Comparative assessment of return on investment in Phase I Centers is likely feasible.
  • There are serious concerns about the feasibility of assessing changes in level of ambition or creativity from Phase I to Phase II.
Agency response to recommendations

The CCI Program accepted the STPI findings that several of the questions posed by the phased implementation of CCIs are not amenable to a program evaluation at this time. The NSF continues to develop the program, informed by multiple sources including the STPI study on evaluation design.

Publications

Zuckerman, B., C. V. Srivastava, and M. E. Hughes. 2011. Findings of the Feasibility Study for Evaluation of the Phased Approach Used to Implement the Centers for Chemical Innovation. Science and Technology Policy Institute, Washington, DC.

Actual Cost

$62,743

OFFICE OF POLAR PROGRAMS

Table 48
Evaluation Name Future Science Opportunities in Antarctica and the Southern Ocean
Contractor National Research Council
Program Name U.S. Antarctic Program (USAP)
Completion date September 2011


Program Description

The U.S. Antarctic Program (USAP) within the National Science Foundation (NSF) is the primary U.S. agency responsible for supporting science in Antarctica and the Southern Ocean.

Evaluation Description

In 2010, the NSF Office of Polar Programs, in coordination with the Office of Science and Technology Policy, initiated two activities to provide guidance to the USAP. In 2011, the National Research Council’s Committee on Future Science Opportunities in Antarctica and the Southern Ocean released the first report in support of this activity. The committee’s task was to identify and summarize the changes to important science conducted on Antarctica and the surrounding Southern Ocean that will demand attention over the next two decades. The second activity, an NSF-organized Blue Ribbon Panel intended to assist in making strategic decisions to improve the logistical support of the U.S. science program in Antarctica and the Southern Ocean over the next two decades, will take place in 2012.

Findings

In response to its charge, the committee highlighted important areas of research by encapsulating each into a single, overarching question. The questions fall into two broad themes: (1) those related to global change and (2) those related to fundamental discoveries. In addition, the committee identified several opportunities to be leveraged to sustain and improve the science program in Antarctica and the Southern Ocean in the coming two decades.

Recommendations

The committee made the following recommendations:

  • Lead the development of a large-scale, interdisciplinary observing network and support a new generation of robust Earth system models.
  • Continue to support a wide variety of basic scientific research in Antarctica and the Southern Ocean, which will yield a new generation of discoveries.
  • Design and implement improved mechanisms for international collaboration.
  • Exploit the host of emerging technologies.
  • Coordinate an integrated polar educational program.
  • Continue strong logistical support for Antarctic science.
Agency response to recommendations

N/A

Publications

Committee on Future Science Opportunities in Antarctica and the Southern Ocean. 2011. Future Science Opportunities in Antarctica and the Southern Ocean. The National Academies Press.

Actual Cost

$465,000

Committees of Visitors (COVs)

During FY 2011, seven directorates and offices convened 21 Committees of Visitors.

Committee of Visitors (COV) reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the outputs and outcomes generated by awardees have contributed to the attainment of NSF's mission and strategic outcome goals. COV reviews are conducted at regular intervals of approximately three years for programs and offices that recommend or award grants, cooperative agreements, and/or contracts and whose main focus is the conduct or support of NSF research and education in science and engineering. Approximately one-third of NSF’s divisions are assessed each year.

A COV typically consists of up to 20 external experts, selected to ensure independence, programmatic coverage, and geographic balance, who represent academia, industry, government, and the public sector. They meet for two or three days to review and assess program priorities, program management, and award accomplishments or outcomes. Each COV prepares a report, and the division or program being reviewed must prepare a response. These reports and responses are submitted to the parent advisory committee and to the Director of NSF. COV recommendations must be addressed by the division director, and appropriate actions must be taken to comply. All reports and responses are public and posted on NSF’s website at: www.nsf.gov/od/oia/activities/cov/covs.jsp.

Committees of Visitors for the following divisions and programs were convened in FY 2011:

  • BIO: Integrative Organismal Biology
  • BIO: Molecular and Cellular Biosciences
  • EHR: Research on Learning in Formal and Informal Settings
  • EHR: Informal Science Education
  • EHR: Information Technology Experiences for Students and Teachers (ITEST)
  • EHR: Noyce Scholarships
  • EHR: Math and Science Partnership (MSP)
  • EHR: Scholarship for Service, Scholarships (S-STEM in FY 2007)
  • EHR: GK-12 Fellows
  • EHR: Integrative Graduate Education and Research Traineeship (IGERT)
  • EHR: ADVANCE
  • ENG: Electrical, Communications and Cyber Systems (ECCS)
  • ENG: Emerging Frontiers in Research and Innovation (EFRI)
  • GEO: GeoSpace Section
  • GEO: Surface Earth Processes Section
  • GEO: Deep Earth Processes Section
  • GEO: Integrative Projects Section
  • MPS: Astronomical Sciences
  • MPS: Materials Research
  • Office of Cyberinfrastructure
  • Office of International Science and Engineering

Strategic Plan Addendum

This addendum to the NSF Strategic Plan for FY 2011-2016 includes three new sections required by the GPRA Modernization Act: Agency Priority Goals, Federal Priority Goals, and Congressional Consultation.

FY 2012-FY 2013 Agency Priority Goals

Access to Digital Products of NSF-Funded Research
Table 49
Impact Statement Increase opportunities for research and education through public access to high-value digital products of NSF-funded research.
Goal Statement By September 30, 2013, NSF will have established policies for public access to high-value data and software in at least two data-intensive scientific domains.
Relevant Strategic Goal Transform the Frontiers
Relevant Strategic Objective T-4, “Enhance research infrastructure and promote data access to support researchers’ and educators’ capabilities and enable transformation at the frontiers.”


Undergraduate Programs

Table 50

Impact Statement Develop a diverse and highly qualified science and technology workforce.
Goal Statement By September 30, 2013, 80 percent of institutions funded through NSF undergraduate programs document the extent of use of proven instructional practices.
Relevant Strategic Goal Transform the Frontiers
Relevant Strategic Objective T-2: “Prepare and engage a diverse STEM workforce motivated to participate at the frontiers.”


Innovation Corps

Table 51

Impact Statement Increase the number of entrepreneurs emerging from university laboratories.
Goal Statement By September 30, 2013, 80 percent of teams participating in the Innovation Corps program will have tested the commercial viability of their product or service. 
Relevant Strategic Goal Innovate for Society
Relevant Strategic Objective I-1, “Make investments that lead to results and resources that are useful to society.” I-2, “Build the capacity of the nation’s citizenry for addressing societal challenges through science and engineering.”


FY 2012-FY 2013 Federal Priority Goals

Per the GPRA Modernization Act (P.L. 111-352) requirement to address Federal Priority Goals in the agency Strategic Plan and Annual Performance Plan, please refer to Performance.gov for information on Federal Priority Goals and the agency’s contributions to those goals, where applicable.

Congressional Consultations

Members of the House Science Committee and Senate Commerce Committee were briefed on the content of this Plan during its development and given the opportunity for comment.

Supporting Information

Annual Performance Plan

This section provides information required by the GPRA Modernization Act on “Strategies and Supporting Analysis.” The required information includes (1) a description of how agency goals will be achieved and the rationale for the strategies chosen, and (2) a description of how human capital programs are used to advance agency performance goals. This required information follows.

Strategies and Supporting Analysis

No one standard strategy is used across NSF for achievement of goals. Goal leaders at NSF choose strategies tailored to their stakeholders’ needs and their institutional capabilities. In general, NSF goals involve testing the impacts of new activities or new approaches to existing activities, so feedback mechanisms are built in. Use of analysis, evidence, and evaluation findings is also at the discretion of each individual goal leader. Performance at NSF is reviewed quarterly by NSF’s Performance Improvement Officer, who reports on goal progress to NSF senior management.

NSF employs a balanced set of performance indicators, milestones, and measures. Due to the nature of NSF investments, the two mission-oriented goals, Transform the Frontiers and Innovate for Society, tend to be output- or outcome-based. The management-oriented goal, Perform as a Model Organization, contains efficiency and customer-service measures, but also output and outcome measures relating to long-term activities such as financial system modernization and strategic human capital management.

Strategic Human Capital Management at NSF

NSF’s Human Capital Strategic Plan constitutes the framework for managing NSF’s human capital system through 2012 and builds upon the strength and commitment of NSF’s workforce to fulfill the Foundation’s mission. This Plan identifies internal and external factors that shape human capital planning and creates an integrated framework of policies and practices that will guide the Foundation in meeting our workforce needs and enable NSF to perform as a model organization. The Plan, which seeks to inform NSF managers and staff alike, takes a strategic approach—both in terms of identifying the human capital challenges facing the Foundation as well as how best to maximize the vitality and capabilities of NSF’s workforce at all levels. Drawing from the Plan’s interdependent goals and the more focused workforce and succession planning action strategies, NSF’s individual directorates and offices will be able to develop organizationally-specific human capital implementation strategies. The Plan is intended as a “living” document—one that accommodates changing environments and needs as they arise. This strategic plan works in concert with NSF’s annual Human Capital Accountability Plan to define clear lines of responsibility and identify specific goals for each area in OPM’s Human Capital Accountability and Assessment Framework.

The next Human Capital Strategic Plan will be developed concurrently with the next NSF Strategic Plan, to be issued February 2014.

Annual Performance Report

This section provides information required by the GPRA Modernization Act on the completeness and reliability of performance data. The required information includes 1) a discussion of the agency’s verification and validation (V&V) practices and 2) a description of data sources and known data limitations. This required information follows.

Verification and Validation of Performance Goals

It is NSF’s practice to follow Government Accountability Office (GAO) guidance and engage external contractors to conduct an independent verification and validation (V&V) review of its annual performance information, data, and processes. The guidance from GAO indicates that agencies should “…describe the means the agency will use to verify its performance data…” and “…provide confidence that [their] performance information will be credible.”[1] NSF will continue this process in FY 2012 and FY 2013.

In FY 2011, IBM Global Business Services (IBM) assessed the validity of NSF data and verified the reliability of the methods used to collect, process, maintain, and report that data, and reviewed NSF’s information systems based on GAO standards for application controls. IBM’s FY 2011 report concluded:

Overall, we verify that NSF relies on sound business practices, internal controls, and manual checks of system queries to ensure accurate performance reporting. NSF maintains adequate documentation of its processes and data to allow for an effective V&V review. Based on our assessment, IBM has confidence in the systems, policies, and procedures used by NSF to calculate results for its performance measures and targets. NSF continues to take concerted steps to improve the quality of their systems and data. We confirm NSF’s commitment to ensuring the accuracy of its reported GPRA results, and the reliability of its processes for collecting, processing, maintaining, and reporting data for its performance goals.[2]

[1] GAO, The Results Act: An Evaluator’s Guide to Assessing Agency Annual Performance Plans, GAO/GGD-10.1.20 (Washington, D.C.: April 1998), pp. 40-41.

[2] IBM Global Business Services, National Science Foundation Performance Measurement Verification and Validation Report, Fiscal Year 2011, October 25, 2011.

Data Sources, Limitations, and Intended Use

The data and information required to measure progress towards NSF’s performance goals in FY 2011 and later years fall into three broad categories:

  • NSF automated administrative systems. Performance monitoring can be a valuable secondary function of such systems. In FY 2011, reporting included data from systems that:
    • store and approve publications such as solicitations, announcements, and Dear Colleague Letters;
    • collect transactional data about proposal and award management;
    • perform financial transactions;
    • store human resources data; and
    • permit keyword search of abstract or full texts of proposals and awards.

The data were used either directly or for achieving milestones that involve the writing of a report. While not all goals require a high level of accuracy, data from these systems are highly reliable.

  • Reports on internal activities. Milestone achievement is often determined from review of records of certain activities and events. Records of this sort tend to be compiled from review of the evidence provided by goal leaders.
  • Data requests of external parties. Qualitative or quantitative information is solicited directly from awardees.

Other parts of the Annual Performance Report required by the GPRA Modernization Act:

Use of non-federal parties in preparation of this report: None

Classified appendices not available to the public: None 

FY 2010 – FY 2011 Priority Goal Report

http://nsf.gov/about/budget/fy2013
