Table of contents
  1. Story
  2. NextGen Conference Speaker Contest
  3. Tweets
  4. Spotfire Dashboard
  5. Research Notes
  6. Thank You for Attending Our Government Big Data Forum
  7. Come Interact With The Government Big Data Community 4 April 2013
  8. Third Annual Government Big Data Forum
  9. CTOvision Big Data Reporting for 2012: CTOs want discipline in the language of sensemaking
  10. Third Annual Government Big Data Forum
  11. Government Big Data: What's next?
    1. Training Certificate
    2. Speaker Bios
      1. Shawn Kingsberry, CIO, Recovery Accountability and Transparency Board
      2. Marina Martin, Head, Education Data Initiative, Education Department
      3. Gary Newgaard, Director of Federal Solutions, EMC Isilon
    3. Slides
      1. Slide 1 Title Government Big Data What's Next
      2. Slide 2 Today's Speakers
      3. Slide 3 Housekeeping
      4. Slide 4 Government Big Data What's Next
      5. Slide 5 Open Data Anonymized, Aggregated
      6. Slide 6 My Data Personal Smart Disclosure
      7. Slide 7 Open Data Overview
      8. Slide 8 Machine Readable CSV & API
      9. Slide 9 Human Readable
      10. Slide 10 Standardized Common Descriptive Metadata
      11. Slide 11 Federated
      12. Slide 12 My Data Overview
      13. Slide 13 Blue Button
      14. Slide 14 Green Button
      15. Slide 15 My Data in Education
      16. Slide 16 RATB Cloud Services High Level Technical Briefing
      17. Slide 17 RATB Cloud Service Logical Architecture
      18. Slide 18 RATB Logical System Design
      19. Slide 19 RATB High Level Technologies
      20. Slide 20 RATB Enterprise Architecture of the Future
      21. Slide 21 Data Governance
      22. Slide 22 Advanced Analytics Cloud 1
      23. Slide 23 Advanced Analytics Cloud 2
      24. Slide 24 Cloud Hub Categorization
      25. Slide 25 RATB Cloud Service Websites 1
      26. Slide 26 RATB Cloud Service Websites 2
      27. Slide 27 RATB Cloud Service Websites 3
      28. Slide 28 RATB Cloud Service Websites 4
      29. Slide 29 RATB Contact Information
      30. Slide 30 EMC Isilon Scale Out NAS
      31. Slide 31 Agenda
      32. Slide 32 What Is Big Data?
      33. Slide 33 The Big Data Challenge
      34. Slide 34 CIOs Turning to Scale-Out to Deal with Massive File-Data Growth
      35. Slide 35 Big Data Apps Need Big Data Storage
      36. Slide 36 Big Data Projects in the Federal Sector
      37. Slide 37 Examples of Federal Sector Big Data
      38. Slide 38 Sample of Government Accounts Using Isilon's Unified Scale-Out Storage...and Why?
      39. Slide 39 EMC Isilon Growing Momentum Healthcare and Life Sciences
      40. Slide 40 The EMC Isilon Difference
      41. Slide 41 "Never Refresh Again" Architecture
      42. Slide 42 The Cost Advantage of Isilon 1
      43. Slide 43 The Cost Advantage of Isilon 2
      44. Slide 44 Isilon Scale-Out NAS Architecture
      45. Slide 45 Largest and Most Scalable File System
      46. Slide 46 Markets and Solutions
      47. Slide 47 Questions ?
  12. Unlocking the Power of Government Analytics
    1. Cover Page
    2. About GovLoop
    3. Executive Summary
      1. Are you using analytics in your agency?
    4. Unlocking the Power of Government Analytics
      1. Expert Insights: Three Steps to Implement Analytics
        1. How to Implement an Analytics Program
          1. Step One
          2. Step Two
          3. Step Three
        2. Adobe & Digital Government
    5. Overview of Government Analytics
      1. What are the benefits of analytics in government?
      2. What are some of the challenges for government analytics?
        1. Volume of data
        2. Variety
        3. Velocity
        4. Identifying value
    6. Data Driven Decisions: How Analytics Transforms Services
      1. A case study from Evanston Public Library
        1. 3 Lessons Learned from Evanston Public Library
          1. 1. Take Emotion Out of Decision Making
          2. 2. Use Data to Define Customer Needs and Demands
          3. 3. Set Success Metrics
    7. Increasing Transparency Initiatives Through Analytics
      1. A case study from Maryland StateStat
        1. 5 Lessons Learned from Maryland StateStat Case Study
          1. 1. Know Your Agency
          2. 2. Have Realistic Goals
          3. 3. Hold People Accountable and Recognize the Importance of Leadership
          4. 4. Show a Clear Business Value
          5. 5. Collect Right Data and Develop Baselines
    8. Fighting Waste, Fraud and Abuse through Analytics
      1. A case study from Centers for Medicare & Medicaid Services (CMS)
        1. Anomaly Detection Models
        2. Predictive Analytics
        3. Social Network Analysis
    9. Ten Steps to Leveraging Analytics in the Public Sector
      1. 1. Collaborate with peers and work across silos to share data and resources
      2. 2. Strong leadership needed
      3. 3. Clear vision of how to use analytics to solve a problem
      4. 4. Identify the right data
      5. 5. Address challenges for implementation
      6. 6. Assess workforce and build analytics support team
      7. 7. Pilot – start small and build quick wins
      8. 8. Choose the proper reporting techniques
      9. 9. Map to strategic needs at agency
      10. 10. Understand any cultural shifts that may take place
    10. 5 Questions to Ask for Analytics Initiatives
      1. Do employees have access to the right information?
      2. What problems are we trying to solve or what service needs improvement?
      3. How do agencies connect the dots and access other data?
      4. What new value has been created for customers and what’s the return on investment?
      5. How do we show that value to leadership and change culture with data?
    11. Final Thoughts & Acknowledgements
    12. GovLoop Resources
    13. Back Cover Page
  13. NEXT

Third Annual Government Big Data Forum



Story

Slides

Big Data in the Age of Analytics: From Infrastructure to Sense Making

Highlights I noted were:

Welcome Bob Gourley, Editor, CTOVision

This is the third year of the forum, and it gets more exciting each year. See Tweets and Exhibits.

Introduction and Morning Keynote Kirit Amin, Deputy CIO & Chief Technology Officer, US Department of Commerce

It is more about People, Culture, and Policy: we need a Chief Data Officer leading an interagency team of Data Scientists working on problems of value to government and citizens, and changes in law to allow cross-agency data integration. This is happening through Hudson Hollister's DataCoalition.org work on the DATA Act of 2013, and through Dominic Sale, OMB's new Chief Analytics Officer, and the Digital Government Strategy's view that "all content is data".

Panel 1: Infrastructure Needs and Capabilities, Moderator: Abe Usher, Chief Innovation Officer, The HumanGeo Group, LLC

The Kitchen Model of Big Data: Chefs, Ingredients, Utensils, and Recipes. The last two are our panel's focus. With Kaggle's Data Science Services and Todd Park's Health Datapalooza IV Apps Competition, you may not need your own infrastructure and capabilities. Chefs are the Data Scientists, Ingredients are the Data Specified or Needed, and Utensils and Recipes are, respectively, the Tools and Experience used by the Data Scientists. We might see Data Science Competitions on TV, like Chef Competitions on Cooking Shows!

We need three capabilities: Get to the Data Centers, Make the Data Centers Interoperable, and Get the Data Centers Talking to One Another. (I can do this with Spotfire 5.5.) Panelists liked my suggestion of Data Science as a Service.
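The three capabilities above (reach each data center, make them interoperable, and get them talking to one another) can be pictured as a thin federation layer over heterogeneous sources. The sketch below is a hypothetical illustration, not any specific system; the agency feeds and field names are invented:

```python
from typing import Callable, Dict, List

Record = Dict[str, object]

def make_adapter(fetch: Callable[[], List[Record]],
                 field_map: Dict[str, str]) -> Callable[[], List[Record]]:
    """Interoperability step: wrap a source-specific fetch so its
    records come back in a shared schema."""
    def adapted() -> List[Record]:
        return [{shared: rec[local] for shared, local in field_map.items()}
                for rec in fetch()]
    return adapted

def federated_query(adapters, predicate):
    """'Get the data centers talking': pull from every source and
    filter against the shared schema in one pass."""
    results = []
    for adapted in adapters:
        results.extend(r for r in adapted() if predicate(r))
    return results

# Two hypothetical agency feeds with different local field names.
agency_a = lambda: [{"grant_id": 1, "usd": 500_000}]
agency_b = lambda: [{"award": 2, "amount_dollars": 75_000}]

adapters = [
    make_adapter(agency_a, {"id": "grant_id", "amount": "usd"}),
    make_adapter(agency_b, {"id": "award", "amount": "amount_dollars"}),
]
big_awards = federated_query(adapters, lambda r: r["amount"] > 100_000)
print(big_awards)  # -> [{'id': 1, 'amount': 500000}]
```

The adapter layer is what "Data Science as a Service" would sell: once every source answers in the shared schema, one query spans them all.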

Networking Break and Exhibits

Lots of handouts and demos to see.

Panel 2: Analytical Tools Moderator: Al Di Leonardo, President & CEO, The HumanGeo Group, LLC

It is about People, Process, and Technology. We need business cases, "sticky data" (data that sticks to people, like places, money, etc.), and better data at big data scale.

Ignite Sessions: Fast Overviews of Emerging Technologies Moderator: Fran Landorf , Principal, Core Consulting, LLC

Gartner says Government has the biggest problems with big data

Lunch

Panel 3: Data Visualization Moderator: Bob Gourley, Editor, CTOVision

At a recent Big Data Technology Symposium, Bill Franks, Teradata Chief Analytics Officer and author of "Taming The Big Data Tidal Wave", said to emphasize the fourth V (value): quickly visualize the data to see if it contains useful information and relationships, and if so, use that portion of it and strengthen its relationships with other big data.
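That advice, screening a slice of data quickly for useful relationships before investing further, can be sketched as a correlation screen. This is a hypothetical illustration with synthetic data, not from the talk; the column names and threshold are invented:

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def screen_for_value(columns, threshold=0.5):
    """Return column pairs whose |correlation| clears the threshold:
    the portion of the data worth deeper analysis."""
    names = list(columns)
    hits = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(columns[a], columns[b])
            if abs(r) >= threshold:
                hits.append((a, b, round(r, 2)))
    return hits

# Synthetic example: "awards" tracks "spend"; "noise" tracks nothing.
random.seed(0)
spend = [random.uniform(1, 100) for _ in range(200)]
columns = {
    "spend": spend,
    "awards": [s * 0.8 + random.gauss(0, 5) for s in spend],
    "noise": [random.uniform(1, 100) for _ in range(200)],
}
print(screen_for_value(columns))
```

In practice the same screen would feed a visualization tool: only pairs that survive it merit a chart and a deeper join against other data.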

Panel 4: The Emerging Role of the Data Scientist Moderator: Bob Gourley, Editor, CTOVision

This reminds me of my three experiences in becoming a data scientist. First, I was the US EPA's first data architect for years but really wanted to work with the data; when I got a new CIO, I told him I wanted to be a data scientist, and he said go for it ('I used Business Intelligence when in private industry') and it helped. Second, I met Letitia Long, Director of NGA, who wanted people who could "put GEOINT in the hands of users (the global disaster response community) and free up the time of our analysts to do deeper analytics," and I did that in a data science way. Third, I asked senior government officials (Gus Hunt, Robert Ames, George Strawn, Todd Park, and Chris Greer) what a data scientist could do of value, and they told me to audit, access, and improve what had been done with government data and to write about it as a data journalist, which is what I have done for the past two years. See my recent example with Census Big Data Visualizations.

Afternoon Keynote Shawn Kingsberry, Assistant Director of Information Technology and CIO, Recovery Accountability and Transparency Board (RATB)

RATB has built a big data infrastructure for OMB and Congress and, with the new Hurricane Sandy relief legislation, will be able to analyze agency data for waste, fraud, and abuse. I reminded him that my clients (AOL.gov) paid me to audit and assess how well Recovery.gov and related sites were delivering as advertised.

Closing Keynote Donna Roy, Executive Director, Information Sharing Environment, Office of the CIO, Department of Homeland Security

NIEM and the ISE are for secure information sharing and do not apply to the White House Open Government Data and Transparency Initiatives. Donna Roy has engaged in a number of LinkedIn discussions recently about the value and future of NIEM and the ISE.

My Tweet: NIEM as Big Data in a Network with Data Science: http://semanticommunity.info/NIEM @NIEMExecDir @RockinNIEM @shareandprotect, 04:44 PM - 25 Feb 13

It was retweeted by Donna Roy (@NIEMExecDir) to her 808 followers. She was a keynote speaker at this Third Annual Government Big Data Forum.

I responded to Bob Gourley's call for presentations: "Please help us evaluate the right issues to address and help us queue up a research agenda for the next year. And let us know if you have a lesson learned or topic you can address with the group or if you are available to speak on a panel with the matrix below."


Four Vs             | Concept                   | Method                   | Goal                                | Result
Volume and Velocity | Big Data = All Content    | Make all content as data | Federal Digital Government Strategy | Knowledge Base
Veracity            | Web-Linked Data           | Semantic Web Data        | Strong Relationships                | Spreadsheets
Value               | Unified Data Architecture | Data Integration         | Data Ecosystem                      | Network Visualizations

During a good discussion about "spreadsheets are better than UML," I commented: "So we really need to deal with big data. Dominic Sale, OMB's new Chief of Data Analytics & Reporting, said the new Digital Government Strategy is 'treating all content as data.' See: http://semanticommunity.info/Big_Data_Symposia#Speaker_Bio_for_Dominic_Sale

So big data = all your content, and here is my matrix for the Third Annual Government Big Data Forum next week: http://semanticommunity.info/Third_Annual_Government_Big_Data_Forum#Story

So the Knowledge Base gives us a semantic index to improve search over all the content, and the data sets in a tool like Spotfire (with over 30 connectors, from spreadsheets to data warehouses to RDF in the new Cray Graph Computer) give us agile in-memory data for analytics."
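The idea of an index that improves search over all the content can be illustrated in miniature with a plain inverted index. This is a hypothetical keyword-level sketch; a real semantic index would layer descriptive metadata and relationships on top of it:

```python
from collections import defaultdict

def build_index(docs):
    """Inverted index: term -> set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, *terms):
    """Return the ids of documents containing every query term."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

# Invented miniature corpus for illustration.
docs = {
    "forum": "government big data forum slides",
    "strategy": "digital government strategy all content is data",
    "recovery": "recovery accountability transparency big data",
}
print(sorted(search(build_index(docs), "big", "data")))  # -> ['forum', 'recovery']
```

Treating "all content as data" means every page, slide, and data set lands in an index like this, so one query spans the whole knowledge base.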

See: Slide 36 Big Data Projects in the Federal Sector

NextGen Conference Speaker Contest

Source: http://www.nextgengovt.com/blog

Name and agency: Brand Niemann, Former Senior Enterprise Architect and Data Scientist, US EPA

Title of Talk: Big Data Exploitation for Government Questions and Answers

150-200 word essay on why you should be chosen: Last week I was interviewed by both Government Computer News, for their 30th Anniversary issue articles, and by the Potomac Institute for Policy Studies, for their survey on this subject, because "you were a data scientist with the government for many years and you are a visionary, a pretty good combination for these articles and surveys".

So I cover many big data conferences as a data scientist and data journalist and use that experience to answer questions and do data analytics with government data. Two specific examples I will talk about are:

Big Data in the Age of Analytics: From Infrastructure to Sense Making: http://semanticommunity.info/Third_Annual_Government_Big_Data_Forum

A Data Scientist's Audition for the Recovery Independent Advisory Panel:

http://semanticommunity.info/FederalTransparency.gov#Story

From this talk you will learn how to become a data scientist/data journalist yourself, or how to recognize and use one in your government big data work.

Background:

You love government and have an idea to make it better: Yes, I did that for 30+ years as a government employee and now for the US, EU, and Japanese governments.

You get to shine in front of important people and peers: Yes, I have had lots of opportunities to do that in my federal government career and now.

You get to go to the best training, hands down, for government for FREE and as a VIP: Yes, I do that now as a data scientist/data journalist writing data stories for AOL Government, Federal Computer Week, Government Computer News, Semantic Community, Japanese Government, etc.

Tweets

Source: https://twitter.com/search/realtime/%23GovBDF

Results for #GovBDF

Tweets Top / All / People you follow

  1. Big Data: need a consistent method to identify items for equitable trade arbitrage in a mesh fabric / matrix?

  2. Rt : At , ' Roy: info sharing must be limited only by policy, not technology.

  3. Please give us your thoughts on the Third Annual Government Big Data Forum:

  4. to the team who made such a great success yesterday. Well done!

  5. Enjoyed the presentations, Q&A, demos, vendors, and conversations at Thanks everyone. Keep the conversations and solutions going!

  6. Fantastic day with Government Forum. Excellent panelists and keynote speakers. | |

  7. Had fun talking about big data at . many thanks for allowing me to speak.

  8. at : Mass. Supreme Court had ruled that = personally identifiable info.

  9. Here is my friend Harold telling us his definition of .

  10. Donna I know everyone at our Big Data Forum appreciated hearing from you today. Thanks!

  11. Shawn it was such an honor to hear from you today at our Big Data Forum

  12. Great context by @NEIMExecDir on data and the architecture and processes required for trust-based information sharing.

  13. At Government Forum: Donna Roy, Office of CIO, . (@ renhotels) [pic]:

  14. .@NEIMExecDir takes the stage as our final keynote at today's Government Forum.

  15. At , ' Roy: info sharing must be limited only by policy, not technology.

  16. At , ' Roy: , Governance, Architecture - all inform US nat'l strategy

  17. Next up at : Donna Roy, Office of CIO, .

  18. For all at Gov Big Data Forum, here is the great articulation of what a data scientist is:

  19. . utilizes 's BusinessObjects solution among others in its data visualization efforts.

  20. Using contracted integrators Amazon & , is able to keep its sites running even in extreme power outages.

  21. So cool and motivational learning from of talk about end to end IT needs and his architecture at

  22. I tried to paraphrase you regarding how you define a data scientist, but think I screwed it up. How do you define it?

  23. Challenge No. 1 is managing . -

  24. If federal doesn't know where cloud fits, there in lies the opportunity (for evangelists). -

  25. Exposing a to a data problem, you see their discipline & the value of what they do. - 's Joey Echeverria

  26. What makes a good data scientist? Curiosity above all else, says Joey Esceverria (Cloudera)

  27. USSOCOM LTC Alex Isaac: User acceptance is a deal breaker

  28. .'s LTC Alex Isaac believes data that's sticky & contains critical details is key for best .

  29. Data isn't friendly to assumptions. Have a good understanding of data in a visualization environment. - Bill Ladd

  30. Context is critical in dataviz, Lisa Sokol from IBM, Matt Schumpert of Datameer

  31. “Ian Fyfe: you can do real-time analytics on transactional data using in-memory DBMS. [And enables this beyond in-mem.]

  32. For folks at , are you enjoying learning about piXlogic? Here are some of my thoughts on them:

  33. MT : By 2018 there could be a shortage of 200k . - Brady Balls on & challenges

  34. Analytics as a service makes consumable and commoditized

  35. By 2018 there could be a shortage of 200k data analysts-Brady Balls quick presentation on inTTENSITY and big data challenges

  36. I just got the best overview of I have ever seen by Will of at the Government Big Data Forum

  37. In DC? Join us at the Government Big Data Forum. Our very own Matt will be sharing insights on the Data Visualization panel.

  38. Danielle Forsyth, Thetus: Frustration with security and data standards is the velocity of their changes.

  39. You can't solve complex problems with vertical silos. - 's Dante Ricci

  40. MT : Ian Fyfe, : "Cleaning" data can remove the signal. E.G., duplicates may be significant.

  41. Ian Fyfe: you can do real-time analytics on transactional data using in-memory DBMS. [CB: And MarkLogic enables this beyond in-mem.]

  42. I am a huge fan of and their CEO Danielle is killing it at this panel on analytical tools. Thetus = great capability.

  43. Ian Fyfe, Pentaho: "Cleaning" data can remove the signal. E.G., duplicates may be significant.

  44. Tariq Kara DHS/ICE: gov could benefit from common terminology

  45. Thanks my friend. I feel like I'm with family here. These are great great people at

  46. Tariq Kara, DHS: we have variety and velocity, not as much volume. [That can still be a Big Data problem, harder than mere volume.]

  47. panel: "Cloud" sounds soft and fluffy. But it's a sharp tool, and if any layer is poorly provided, all will be slow or unreliable

  48. Thx RT : Tremendous effort put out by ISE and to share data and improve standards and processes

  49. Abe Usher Says "BigData now is like plastics in 50's"

  50. We need a truly agile approach to successfully address our big data challenges ACS agrees!

  51. The goal is interoperability & making mobile. - 's Joshua Beadr

  52. For friends at Gov Big Data Forum Hear more from of at podcast

  53. For friends at Gov Big Data Forum Hear more from of at podcast

  54. Users expect Google-like speed of big data systems

  55. Thanks, ! There are certainly a lot of great discussions around & happening at . Love today's energy.

  56. Big Data is the new plastics of today. We are still realizing its potential. Paul Brown NSA on Infrastructure Needs panel

  57. Committing to 2-3 year projects sets yourself up for the most expensive bug - Your requirements. - 's

  58. If you are at the Government Big Data Forum pls go thank for having the idea to start doing these. And follow him!

  59. Do a search on NoSQL to see how an idea starts and gains momentum

  60. Be careful to remember people and not to over-focus on technology. Big data helps focus on people and enable the mission outcomes

  61. Infrastructure needs and capabilities panel with Abe Usher moderating

  62. You can find a lot of our sponsors on Twitter! Platinum sponsors include & !

  63. Kirit Amin: "We need some Yodas" to set the culture & vision across the govt, groom the next generation. Always enjoy a star wars ref

  64. Kirit Amin cites legislation & policy as an inhibitor of faster & tech adoption in .

  65. Quick pop in at Crystal City. Next stop Farragut West and some awesomeness at .

  66. We'll second that. MT : Wish you were here Dave. RT : Bummed to be missing this years .

  67. Kirit Amin: data standard is delivering for Information Sharing Env (ISE) across gov, XML services for architecture.

  68. Tremendous effort put out by ISE and NIEM to share data and improve standards and processes

  69. There is a tremendous effort in info sharing over the last 25 years. - 's Deputy Kirit Amin

  70. Listening to Kirit Amin kick off today's

  71. Thanks for that photo. Can you photoshop that. I wish I was taller, wish I was a baller.

  72. Technology must follow the mission...Kirit Amin keynote at

  73. Morning Keynote: Kirit Amin, Deputy CIO & CTO, US Dept of Commerce - technology: a walk down memory lane & fast forward to Cloud.

  74. Kirit Amin gives morning keynote. Reviewing OMB guidance, cloud, security, and challenges of technology

  75. Wish you were here Dave. RT : Bummed to be missing this years .

  76. Kirit Amin highlights our multigenerational workforce and the need for more trained personnel

  77. Processes, policies, and legal issues constrain BIG data in govt

  78. There are issues and challenges related to acquisition, training, and skills of the workforce - big data has a big human element

  79. Kirit Amin: For Big Data: first people, then process, last technology. Same with cyber threat--mostly internal and inadvertent.

  80. Can you tell us the twitter address of every sponsor here at the I want to make sure all have an easy way to follow.

  81. Do we think we have trained data personnel in our organizations and agencies?

  82. Data fatigue is a challenge in agencies

  83. Kirit Amin taking us down memory lane. When was the first domain name issued?

  84. Kirit Amin, Commerce Dept CTO: 2/3 of Big Data projects fail because they focus on tech, not problem. Carahsoft Big Data forum

  85. Bummed to be missing this years .

  86. Hey, attendees! Don't miss our sponsored big data book giveaway!

  87. Welcome, all, to this year's Government Forum!

  88. Arrived early for Cold morning in DC. Looking forward to the discussions and ideas.

  89. I just found on Twitter. Pls follow him to encourage him to post more. And look for him at the Big Data forum tomorrow.

  90. Attend tomorrow then come see in action at one of our workshops. April 23-25. Reg here:

  91. Learn the best practices for using tools in support of government at :

  92. Attending the Government Forum tomorrow in DC? Register before it’s too late!

  93. RT : 4 April 2013 Government Big Data Forum < will be representing

  94. Please review these: Final Preparations For The 4 April 2013 Government Big Data Forum

  95. Please review these: Final Preparations For The 4 April 2013 Government Big Data Forum

  96. Coming to the Gov Big Data Forum 4 Apr? This is required reading: 7 Thoughts on Hadoop’s Seventh Birthday

  97. Here is your chance to interact w/ great Gov thought leaders like Amin/Kingsberry/Roy

  98. Here is your chance to interact w/ great Gov thought leaders like Amin/Kingsberry/Roy

  99. Here is your chance to interact w/ great Gov thought leaders like Amin/Kingsberry/Roy

  100. Here is your chance to interact w/ great Gov thought leaders like Amin/Kingsberry/Roy

  101. Here is your chance to interact w/ great Gov thought leaders like Amin/Kingsberry/Roy

  102. What is the role of a data scientist in agencies? Find out April 4 at :

  103. Join editor & for this week's Government Forum on Thursday.

  104. What skills does a Scientist need? Find out at in Washington, DC on April 4:

  105. Can solutions work in ? Find out at in Washington, DC on April 4:

  106. Attend on 4 April then come see in action at one of our workshops. 23-25 April. Reg here:

  107. RT : We'll cover emerging def of during .. event April 4. > We'll be there!

  108. Our gov big data forum features greatest tech in globe, including: , , , , .

  109. Learn how new visualization can help make sense of large amounts of . Attend :

  110. We'll cover emerging definitions of during this year's event April 4.

Spotfire Dashboard

For Internet Explorer users and those wanting full-screen display, use: Web Player. Get Spotfire for iPad: iPad App

Research Notes

Going as Press! Looked at https://twitter.com/search/realtime/%23GovBDF

Responded to call for presentations - should have saved it!

2nd Big Data Forum: http://semanticommunity.info/AOL_Government/Intelligence_Community_Loves_Big_Data

http://www.govloop.com/profiles/blog...ata-scientists

http://www.govloop.com/profiles/blog...ment-analytics

This link does not work: http://Federalaccountability.gov

I have worked with the data from this link: http://www.recovery.gov/Pages/default.aspx

Look at this link for data: http://www.educationjobsfund.gov/Pages/home.aspx

Look at this link for data: http://www.federaltransparency.gov/Pages/default.aspx

Thank You for Attending Our Government Big Data Forum

Source: Email

MY NOTE: Dominic Sale's remarks about all content as data is at 12:54 in the video: http://nsp.performedia.com/afcea-bethesda/bdts13/gallery?sid=58078

Also see White Papers: https://www.signup4.net/Public/ap.aspx?EID=BIGD25E&TID=sANuzAr7cQGmui1pBNmudA%3D%3D

Thank you for attending our 2013 Government Big Data Forum. We value your opinion and would appreciate any thoughts you have on this event, or the overall Big Data community. Please click here to give us your feedback.

Attendees heard from government and industry experts about best practices for Big Data, emerging technologies and concepts designed to address the full spectrum of agency mission needs, and issues that may arise. Session topics included keynotes from the following speakers:

  • Kirit Amin, Deputy CIO and Chief Technology Officer, US Department of Commerce
  • Shawn Kingsberry, Assistant Director of Information Technology and Chief Information Officer, Recovery Accountability and Transparency Board
  • Donna Roy, Executive Director, Information Sharing Environment, and Office of the CIO, Department of Homeland Security

The recording and presentation slides approved for release will be available within the next week at www.governmentbigdataforum.com. The site also features additional information on all of our speakers, as well as sponsor resources on the technologies presented. Click here if you would like to view pictures from the event.

If you have any questions about this event or would like more information, please contact the Big Data team by emailing bigdata@carahsoft.com

Sincerely,

Big Data Government at
Carahsoft Technology Corp.
888-662-2724 (Toll Free)

bigdata@carahsoft.com
www.governmentbigdataforum.com

Come Interact With The Government Big Data Community 4 April 2013

Source: http://ctovision.com/2013/03/come-interact-with-the-government-big-data-community-4-april-2012/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+typepad%2Fctovision%2Fcto_vision+%28CTOvision.com%29

The Government Big Data Forum has been gathering every year for three years now to discuss how to leverage Big Data for government mission impact. We dive deep into lessons learned, share issues/challenges, receive demos from hot technology firms and network with fellow practitioners in a day-long forum.

This year we are so excited about this event because of the terrific people who have signed up. We have attendees from across the federal government (all three branches!) and that should make for a terrific discussion. We have invited some of the most experienced practitioners and leaders of the community to address and interact with the group, including:

Kirit Amin
Deputy CIO and Chief Technology Officer
US Department of Commerce

Kirit Amin was named the U.S. Department of Commerce, Deputy Chief Information Officer and Chief Technology Officer, in November 2012.

Shawn Kingsberry
Assistant Director of Information Technology and Chief Information Officer
Recovery Accountability and Transparency Board

Shawn Kingsberry is the Assistant Director of Technology and the Chief Information Officer for the Recovery Accountability and Transparency Board.

Donna Roy
Executive Director, Information Sharing Environment
Office of the CIO, Department of Homeland Security

Donna Roy joined the DHS in December of 2006 and currently serves as the CIO’s Executive for Information Sharing. 

Plus stalwarts of the community from: NGA, NSA, CIA, and the US Army.

And we have hand selected speakers from some of the greatest technology firms in the big data/analysis/sensemaking community including:

Cloudera, MarkLogic, Thetus, Datameer, IBM (Analyst Notebook), EMC, SAP, Recorded Future, Splunk, 10Gen, FlexAnalytics, Parabal, Optensity, VMware, Arista, VirtualInstruments, Qbase, Nutanix and Symantec

 

Third Annual Government Big Data Forum

Source: http://events.carahsoft.com/event-detail/2340/govloop/
Big Data in the Age of Analytics: From Infrastructure to Sense Making

Join us for the third annual Government Big Data Forum in Washington, D.C. as Carahsoft and Bob Gourley of CTOvision.com bring together thought leaders and experts from the federal Big Data community to continue the conversation around best practices for Big Data technology. This year's forum will examine emerging technologies and concepts designed to address the full spectrum of agency mission needs for Big Data. 

Announcing our Government Keynotes:

Kirit Amin
Deputy CIO and Chief Technology Officer
US Department of Commerce

Kirit Amin was named the U.S. Department of Commerce, Deputy Chief Information Officer and Chief Technology Officer, in November 2012.
 

 

Shawn Kingsberry
Assistant Director of Information Technology and Chief Information Officer
Recovery Accountability and Transparency Board

Shawn Kingsberry is the Assistant Director of Technology and the Chief Information Officer for the Recovery Accountability and Transparency Board.  


Donna Roy
Executive Director, Information Sharing Environment
Office of the CIO, Department of Homeland Security

Donna Roy joined the DHS in December of 2006 and currently serves as the CIO’s Executive for Information Sharing. 

 

Agenda

07:00am - 07:45am Registration and Breakfast
07:45am - 08:00am Welcome
08:00am - 08:45am Introduction and Morning Keynote
08:45am - 09:30am Panel 1: Infrastructure Needs and Capabilities
09:30am - 10:15am Networking Break and Exhibits
10:15am - 11:00am Panel 2: Analytical Tools
11:00am - 11:45am Ignite Sessions: Fast Overviews of Emerging Technologies
11:45am - 12:30pm Lunch
12:30pm - 01:15pm Panel 3: Data Visualization
01:15pm - 02:00pm Panel 4: The Emerging Role of the Data Scientist
02:00pm - 02:45pm Afternoon Keynote
03:00pm - 03:45pm Closing Keynote

Please note, agenda is subject to change.

Registration is from 7:00am to 7:45am. Discussion will start promptly at 7:45am. 

In preparation for this year's event we are encouraging all attendees to review these emerging definitions of Big Data derived from Gartner constructs. 

CTOvision Big Data Reporting for 2012: CTOs want discipline in the language of sensemaking

This special report provides insights from our reporting over the last 12 months, including summaries of our Government Big Data Newsletter (sign up for this weekly report here: http://ctovision.com/newsletter-subscriptions).

Among the many Big Data themes we reported on in 2012, one seemed to resonate the most with our readers: all of us with a techie bent have realized that we need more discipline in our use of the term Big Data. We revisited this need for discipline in our post:

Big Data Defined for 2013: A definition that can help in your interaction with the IT community

In it we suggest everyone follow the lead of the TechAmerica Foundation and their use of Gartner constructs in defining Big Data. At CTOvision we will use the term Big Data this way:

Big Data: A phenomenon defined by the rapid acceleration in the expanding volume of high velocity, complex and diverse types of data. Big Data is often defined along three dimensions: volume, velocity and variety.

Big Data Solutions: Advanced techniques and technologies to enable the capture, storage, distribution, management and analysis of information.

Early in the year we provided insights for program managers who want to get started with Big Data solutions. We gave quickstart tips on how you can stand up your own cluster in the cloud. We followed up with ways you can quickly use Whirr to automate that.
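The Whirr workflow mentioned above boils down to a small properties file plus two commands. The sketch below is illustrative only, patterned on Apache Whirr's published Hadoop recipe; the cluster name, node counts, and credential variables are placeholder values:

```properties
# hadoop.properties -- minimal Apache Whirr recipe for a cloud Hadoop cluster
whirr.cluster-name=myhadoopcluster
# one master (namenode + jobtracker) and three workers (datanode + tasktracker)
whirr.instance-templates=1 hadoop-namenode+hadoop-jobtracker,3 hadoop-datanode+hadoop-tasktracker
whirr.provider=aws-ec2
whirr.identity=${env:AWS_ACCESS_KEY_ID}
whirr.credential=${env:AWS_SECRET_ACCESS_KEY}

# Then, from a shell:
#   whirr launch-cluster  --config hadoop.properties
#   whirr destroy-cluster --config hadoop.properties
```

The appeal at the time was exactly this: the entire cluster definition lives in one small file, so standing up or tearing down a cloud Hadoop cluster is a single command.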

Through the year we published several pieces on topics associated with the ethics issues around Big Data. This included a series by Kord Davis.

We reported extensively on new concepts for Big Data involving very large quantities of data in memory. The greatest expert in this field, Terracotta CEO Robin Gilthorpe, provided his views on Big Data trends to watch in 2013 in a YouTube video we highlighted to our readers. His view is that requirements will drive the industry to several new highs, including dramatic social change. His five predictions for 2013 are:

  • Big Data will be fast data – Enterprises will profit from Big Data intelligence in proportion to how quickly they can act on it.
  • Rise of the hybrid cloud – It’s no longer about building your own platform; it’s more efficient to play in ecosystems.
  • CIOs and CMOs get a lot closer – Marketing spend on technology is about to eclipse IT spend on technology.
  • The Internet of things crosses the chasm – In just a few years, over 25 billion data-producing devices will be connected.
  • Social becomes part of life’s fabric – Remember e-business departments? Social will permeate in the same way.

We also wrote about new concepts for capture, storage, distribution and management of data, such as dispersed compute storage. Solutions like this from Cleversafe (see Cleversafe: how does it really work?) are true game changers, delivering dramatic improvements to security and functionality with a quick return on investment.
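The core idea behind dispersed storage is erasure coding: data is cut into slices with added redundancy and spread across nodes, so the original survives the loss of some slices. The toy Python sketch below is our own illustration of that idea with a single XOR parity slice; it is not Cleversafe's actual algorithm, which uses Reed-Solomon-style codes that tolerate multiple lost slices across many storage nodes.

```python
from functools import reduce

def disperse(data: bytes, k: int) -> list[bytes]:
    """Split data into k equal slices plus one XOR parity slice."""
    if len(data) % k:
        data += b"\x00" * (k - len(data) % k)   # pad to a multiple of k
    size = len(data) // k
    slices = [data[i * size:(i + 1) * size] for i in range(k)]
    # Parity byte = XOR of the corresponding byte in every data slice.
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*slices))
    return slices + [parity]

def recover(slices: list[tuple[int, bytes]], k: int, length: int) -> bytes:
    """Rebuild the original from any k of the k+1 slices.

    `slices` is a list of (index, slice) pairs; index k is the parity slice.
    """
    have = dict(slices)
    missing = [i for i in range(k) if i not in have]
    if missing:  # rebuild the single lost data slice by XORing the rest
        lost = missing[0]
        cols = zip(*(have[i] for i in range(k + 1) if i != lost))
        have[lost] = bytes(reduce(lambda a, b: a ^ b, col) for col in cols)
    return b"".join(have[i] for i in range(k))[:length]

# Any one slice can be lost and the data still comes back intact.
parts = disperse(b"big data", k=4)
surviving = [(i, s) for i, s in enumerate(parts) if i != 1]  # slice 1 lost
assert recover(surviving, k=4, length=8) == b"big data"
```

In a real dispersed-storage system each slice would live on a different node or site, which is where the security and availability gains come from: no single node holds readable data, and node failures do not lose it.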

We reported on many other firms associated with the fielding of high quality Big Data solutions into the federal enterprise, including MarkLogic, Oracle, Datameer, Cloudera, Terracotta, Cleversafe, Splunk, Kapow, Sitscape, Cloud Front Group, ClearStory, and Thetus. These firms are fielding real, working solutions for Big Data, and we are sure we will be reporting more on them in 2013.

Another clear theme in our reporting of 2012 on Big Data was the importance of mission focus. That is why we are all so excited about the new technical capabilities of Hadoop and the related technologies. It is about impact to mission. Which leads to the Government Big Data Solutions Award:

Our reporting on Big Data for 2012 included announcing the results of the Government Big Data Solutions Award. The Government Big Data Solutions Award was established to highlight innovative solutions and facilitate the exchange of best practices, lessons learned and creative ideas for addressing Big Data challenges. The Top Five Nominees of 2012 were chosen based on criteria that included:

  • Focus on current solutions: The ability to make a difference in government missions in the very near term was the most important evaluation factor.
  • Focus on government teams: Industry supporting government was also considered, but this is about government missions.
  • Consideration of new approaches: New business processes, techniques, tools, models for enhancing analysis are key.

Winner of the 2012 Government Big Data Solutions Award was the National Cancer Institute’s Frederick National Laboratory.

The NCI-funded Frederick National Laboratory has been using Big Data solutions in pioneering ways to support researchers working on complex challenges around the relationship between genes and cancers. In a recent example, they built infrastructure capable of cross-referencing the relationships between 17,000 genes and five major cancer subtypes across 20 million biomedical publication abstracts, and of cross-referencing TCGA gene expression data for a simulated 60 million patients and miRNA expression for a simulated 900 million patients. The result: understanding of additional layers of the pathways these genes operate in and the drugs that target them, which will help researchers accelerate their work in areas of importance for all humanity. This solution, based on the Oracle Big Data Appliance with the Cloudera Distribution of Apache Hadoop (CDH), leverages capabilities available from the Big Data community today in pioneering ways that can serve a broad range of researchers. The promising approach of this solution is repeatable across many other Big Data challenges in bioinformatics, making it worthy of its selection as the 2012 Government Big Data Solutions Award winner.
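At its core, the cross-referencing job described above is a co-occurrence count over abstracts. The toy Python sketch below shows only that core idea; the gene and subtype lists and sample abstracts are made-up placeholders, and the real system ran equivalent jobs as MapReduce over CDH rather than a single-machine loop:

```python
from collections import Counter
from itertools import product

# Illustrative placeholder vocabularies (the real job used ~17,000 genes
# and five major cancer subtypes).
GENES = {"TP53", "BRCA1", "KRAS"}
SUBTYPES = {"luminal", "basal-like", "HER2-enriched"}

def co_mentions(abstracts: list[str]) -> Counter:
    """Count how often each (gene, subtype) pair appears in the same abstract."""
    pairs = Counter()
    for text in abstracts:
        words = set(text.replace(",", " ").split())
        genes = GENES & words          # genes mentioned in this abstract
        subtypes = SUBTYPES & words    # subtypes mentioned in this abstract
        pairs.update(product(sorted(genes), sorted(subtypes)))
    return pairs

abstracts = [
    "TP53 mutation frequency in basal-like breast tumors",
    "BRCA1 and TP53 pathways in basal-like disease",
]
counts = co_mentions(abstracts)
# counts[("TP53", "basal-like")] -> 2
```

Each abstract is independent, which is exactly why this kind of analysis maps cleanly onto Hadoop: the per-abstract matching is the map step, and summing the pair counts is the reduce step.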

We also reported on a classification framework for Big Data solutions presented in a very insightful post on Classifying Today’s “Big Data Innovators”. This is an innovative approach that is easy to think through, should be repeatable for many vendors in this space, and should help enterprise technologists think through which vendors may be right for their mission needs. In it he categorizes the 13 innovative Big Data innovators reported on by InformationWeek. They are:

1.  MongoDB
2.  Amazon (Redshift, EMR, DynamoDB)
3.  Cloudera (CDH, Impala)
4.  Couchbase
5.  Datameer
6.  Datastax
7.  Hadapt
8.  Hortonworks
9.  Karmasphere
10.  MapR
11.  Neo Technology
12.  Platfora
13.  Splunk

He classifies them into:

1.  Operational data stores that allow flexible schemas
2.  Hadoop distributions
3.  Real-time Hadoop-based analytical platforms
4.  Hadoop-based BI solutions

We will likely return to this classification for reporting in 2013.

What does our reporting over the last 12 months signal for the next 12 months? We believe we will see a continued expansion of the user end of Big Data solutions. It is probably an oversimplification to say it this way, but one way to look at it is that we have an approach to the backend infrastructure, and that is primarily one built on the Apache Hadoop framework of software over commodity IT, integrated into existing but modern enterprise solutions. There is room for innovation here of course, but in general the path of the backend is set and will continue. The dynamic change to expect now is in the user-facing applications. Brace yourself! Changes there will be dynamic.

Third Annual Government Big Data Forum

Source: http://www.governmentbigdataforum.com/index.html

 

Third Annual Government Big Data Forum
Big Data in the Age of Analytics: From Infrastructure to Sense Making

Announcing our Morning Keynote: Kirit Amin, Deputy CIO & Chief Technology Officer at US Department of Commerce

With experience as the Chief Information Officer for the Bureau of Consular Affairs and Director of the Office of Consular Systems and Technology (CST) at the US Department of State, the Chief Technology & Innovation Officer at the Department of Housing and Urban Development, and his current position as Deputy CIO & Chief Technology Officer at the Department of Commerce, Kirit Amin will open the day with unique insights into the lessons learned, remaining challenges, and gaps government organizations face in the domain of Big Data.

Join us for the third annual Government Big Data Forum in Washington, D.C. as Bob Gourley of CTOvision.com and Carahsoft bring together thought leaders and experts from the federal big data community to continue the conversation around best practices for Big Data technology. This year's forum will examine emerging technologies and concepts designed to address the full spectrum of agency mission needs for Big Data.

We have invited speakers from several different agencies to examine government infrastructure needs and changes to enable big data solutions, discuss industry best practices and recommendations for using analytical tools in support of government missions, new data visualization capabilities designed for large scale data challenges, and the role of the data scientist in the context of government agencies.

Discussion topics include:
  • What are best practices in engineering for a data enabled enterprise?
  • What new visualization capabilities can help users make sense of large quantities of data?
  • Can Big Data solutions work in real time?
  • What skills does a Data Scientist need, or should all users be empowered with tools so Data Scientists are not needed?

In preparation for this year's event we are encouraging all attendees to review these emerging definitions of Big Data derived from Gartner constructs:

Big Data: A phenomenon defined by the rapid acceleration in the expanding volume of high velocity, complex and diverse types of data. Big Data is often defined along three dimensions: volume, velocity, and variety.

Big Data Solutions: Advanced techniques and technologies to enable the capture, storage, distribution, management, and analysis of information.

Agenda

07:00am - 07:45am Registration and Breakfast
07:45am - 08:00am Welcome
08:00am - 09:00am Introduction and Morning Keynote
09:00am - 09:45am Panel 1: Infrastructure Needs and Capabilities
09:45am - 10:45am Networking Break and Exhibits
10:45am - 11:30am Panel 2: Analytical Tools
11:30am - 12:15pm Ignite Sessions: Fast Overviews of Emerging Technologies
12:15pm - 01:00pm Lunch
01:00pm - 01:45pm Panel 3: Data Visualization
01:45pm - 02:30pm Panel 4: The Emerging Role of the Data Scientist
02:30pm - 03:30pm Afternoon Keynote and Conclusion

Please note, agenda is subject to change.

Registration is from 7:00am to 7:45am. Discussion will start promptly at 7:45am.

We look forward to seeing you there!

Questions? Contact:
Big Data Government at 
Carahsoft Technology Corp.
888-662-2724 (Toll-Free)
bigdata@carahsoft.com

Join the Government Big Data Forum conversation on Twitter using our event hashtag #GovBDF.

Register Today!

Date: April 4, 2013
Time: 7:00am - 3:30pm
Location:
Mayflower Renaissance Hotel
1127 Connecticut Avenue NW
Washington, D.C. 20036
Map

Sponsors
 

Government Big Data: What's next?

Training Certificate: https://wcc.on24.com/event/59/42/05/...ertificate.pdf

Slide Deck: https://wcc.on24.com/event/59/42/05/...ata_slides.pdf

New GovLoop Report: Unlocking the Power of Government Analytics: http://www.govloop.com/profiles/blog...ment-analytics

Game-Changing Technologies for a New Paradigm: https://wcc.on24.com/event/59/42/05/...angingtech.pdf

EMC ISILON ONEFS OPERATING SYSTEM: https://wcc.on24.com/event/59/42/05/...s_overview.pdf

Data Surge and Automated Analysis: The Latest ISR Challenge: https://wcc.on24.com/event/59/42/05/..._challenge.pdf

TOP 5 REASONS TO CHOOSE EMC ISILON SCALE-OUT NAS SOLUTIONS: https://wcc.on24.com/event/59/42/05/...ssolutions.pdf

IDC on EMC: https://wcc.on24.com/event/59/42/05/rt/1/documents/resourceList1363726318513/idc_scale_out_marketscape_.pdf

EMC Storage: https://wcc.on24.com/event/59/42/05/rt/1/documents/resourceList1363726340271/scale_out_storage_for_government.pdf

A Path to Insights and Improved Decision Making: Predictive Analytics: http://www.govloop.com/group/analyti...dictive-analyt

Does Big Data REALLY Matter?: http://www.govloop.com/profiles/blog...-really-matter

3 Keys to Big Data: Quick Wins, Clear Scope, Communicate: http://www.govloop.com/profiles/blog...pe-communicate

Big Data: A Big Problem That's Getting Bigger: http://www.govloop.com/profiles/blog...getting-bigger

The good news is you can find the archived version here:

WEBCAST LINK:  http://w.on24.com/r.htm?e=594205&s=1...C1ED1AE4C5D40C
EMAIL:  bniemann@cox.net

You'll also be able to access a copy of the slide deck and download your GovLoop training certificate at the link above.

Interested in more information? Be sure to check out GovLoop's new guide: Unlocking the Power of Government Analytics.

http://event.on24.com/eventRegistrat...7&mode=launch#

Also, mark your calendars for the third annual Government Big Data Forum in Washington, D.C. Join Carahsoft and Bob Gourley of CTOvision.com as they continue the conversation around best practices for Big Data technology. This year's forum will examine emerging technologies and concepts designed to address the full spectrum of agency mission needs for Big Data. Learn more here!

Thanks again, Steve Ressler (aka Mr. GovLoop)

Training Certificate

Training Certificate: https://wcc.on24.com/event/59/42/05/...ertificate.pdf (PDF)

TrainingCertificate.GIF

Speakers Bio

Shawn Kingsberry, CIO, Recovery Accountability and Transparency Board

Shawn Kingsberry is the Chief Information Officer for the Recovery Accountability and Transparency Board (Recovery Board). His role as the Recovery Board’s CIO includes serving as a senior advisor to senior executives on all information technology matters, including investments and emerging initiatives, and assisting in the coordination of all day-to-day information technology operations and activities. In February of 2011, he received the prestigious FED 100 Award for his leadership and persistence in implementing four information systems for the Recovery Board, including FederalReporting.gov and Recovery.gov.

He has over 18 years of experience with the federal government leading organizations and implementing change across numerous federal agencies.  His expertise has also led to enhanced operational performance of numerous programs.

Before joining the Recovery Board, Mr. Kingsberry managed information technology programs at agencies within the U.S. Department of Agriculture (USDA), including the Forest Service and Rural Development.

Marina Martin, Head, Education Data Initiative, Education Department

Marina Martin is currently serving as Senior Advisor to US CTO Todd Park, focusing on liberating government data to fuel innovation, create jobs, and improve the lives of everyday Americans. Marina is also the Entrepreneur-in-Residence at the US Department of Education, where she works with internal and private sector innovators to use open educational data to power apps and services that can help achieve national priorities such as increasing college affordability and completion rates. Prior to joining the federal government, Marina ran the business efficiency consulting firm The Type-A Way, authored the book "Business Efficiency for Dummies," and worked on open data initiatives across Washington State, including organizing the first-ever Startup Weekend GOV.

Gary Newgaard, Director of Federal Solutions, EMC Isilon

Gary Newgaard is the Director of Federal Solutions at EMC Isilon. Prior to joining EMC Isilon, Gary held senior positions at ASRC Management, Paragon Systems, and Compaq/HP. Gary earned his Bachelor of Science in Business Administration with a major in Marketing from the University of Tennessee. In addition to his university degree, Gary went on to the Executive Education Program at Harvard Business School. Mr. Newgaard sits as a board member on various industry councils and is an entrepreneurial, results-driven leader with a strong and solid history of driving IT/high-tech market share, building revenue and profits, and bringing customer satisfaction to new heights. Gary resides in Northern VA and works out of the McLean, VA EMC office.

Slides

PDF

Slide 1 Title Government Big Data What's Next

GovLoop03212013Slide1.png

Slide 2 Today's Speakers

GovLoop03212013Slide2.png

Slide 3 Housekeeping

GovLoop03212013Slide3.png

Slide 4 Government Big Data What's Next

GovLoop03212013Slide4.gif

Slide 5 Open Data Anonymized, Aggregated

GovLoop03212013Slide5.gif

Slide 6 My Data Personal Smart Disclosure

GovLoop03212013Slide6.gif

Slide 7 Open Data Overview

GovLoop03212013Slide7.gif

Slide 8 Machine Readable CSV & API

GovLoop03212013Slide8.gif

Slide 9 Human Readable

GovLoop03212013Slide9.gif

Slide 10 Standardized Common Descriptive Metadata

GovLoop03212013Slide10.gif

Slide 11 Federated

GovLoop03212013Slide11.gif

Slide 12 My Data Overview

GovLoop03212013Slide12.gif

Slide 13 Blue Button

GovLoop03212013Slide13.gif

Slide 14 Green Button

GovLoop03212013Slide14.gif

Slide 15 My Data in Education

GovLoop03212013Slide15.gif

Slide 16 RATB Cloud Services High Level Technical Briefing

GovLoop03212013Slide16.gif

Slide 17 RATB Cloud Service Logical Architecture

GovLoop03212013Slide17.gif

Slide 18 RATB Logical System Design

GovLoop03212013Slide18.gif

Slide 19 RATB High Level Technologies

GovLoop03212013Slide19.gif

Slide 20 RATB Enterprise Architecture of the Future

GovLoop03212013Slide20.gif

Slide 21 Data Governance

GovLoop03212013Slide21.gif

Slide 22 Advanced Analytics Cloud 1

http://Federalaccountability.gov

GovLoop03212013Slide22.gif

Slide 23 Advanced Analytics Cloud 2

GovLoop03212013Slide23.gif

Slide 24 Cloud Hub Categorization

GovLoop03212013Slide24.gif

Slide 25 RATB Cloud Service Websites 1

http://Recovery.gov

GovLoop03212013Slide25.gif

Slide 26 RATB Cloud Service Websites 2

http://Educationjobsfund.gov

GovLoop03212013Slide26.gif

Slide 27 RATB Cloud Service Websites 3

http://Federaltransparency.gov

GovLoop03212013Slide27.gif

Slide 28 RATB Cloud Service Websites 4

http://Federalaccountability.gov

GovLoop03212013Slide28.gif

Slide 29 RATB Contact Information

GovLoop03212013Slide29.gif

Slide 30 EMC Isilon Scale Out NAS

GovLoop03212013Slide30.gif

Slide 31 Agenda

GovLoop03212013Slide31.gif

Slide 32 What Is Big Data?

GovLoop03212013Slide32.gif

Slide 33 The Big Data Challenge

GovLoop03212013Slide33.gif

Slide 34 CIOs Turning to Scale-Out to Deal with Massive File-Data Growth

GovLoop03212013Slide34.gif

Slide 35 Big Data Apps Need Big Data Storage

GovLoop03212013Slide35.gif

Slide 36 Big Data Projects in the Federal Sector

GovLoop03212013Slide36.gif

Slide 37 Examples of Federal Sector Big Data

GovLoop03212013Slide37.gif

Slide 38 Sample of Government Accounts Using Isilon's Unified Scale-Out Storage...and Why?

GovLoop03212013Slide38.gif

Slide 39 EMC Isilon Growing Momentum Healthcare and Life Sciences

GovLoop03212013Slide39.gif

Slide 40 The EMC Isilon Difference

GovLoop03212013Slide40.gif

Slide 41 "Never Refresh Again" Architecture

GovLoop03212013Slide41.gif

Slide 42 The Cost Advantage of Isilon 1

GovLoop03212013Slide42.gif

Slide 43 The Cost Advantage of Isilon 2

GovLoop03212013Slide43.gif

Slide 44 Isilon Scale-Out NAS Architecture

GovLoop03212013Slide44.gif

Slide 45 Largest and Most Scalable File System

GovLoop03212013Slide45.gif

Slide 46 Markets and Solutions

GovLoop03212013Slide46.gif

Slide 47 Questions ?

GovLoop03212013Slide47.gif

Unlocking the Power of Government Analytics

Source: http://www.govloop.com/profiles/blog...ment-analytics (PDF)

My Question: Why are the GovLoop documents in PDF and where is the actual big data in this?

Cover Page

AnalyticGuide2013FrontCoverPage.png

About GovLoop

GovLoop’s mission is to connect government to improve government. We aim to inspire public sector professionals by acting as the knowledge network for government. The GovLoop community has over 60,000 members working to foster collaboration, solve problems and share resources across government.
 
The GovLoop community has been widely recognized across multiple sectors. GovLoop members come from across the public sector. Our membership includes federal, state, and local public servants, industry experts and professionals grounded in academic research. Today, GovLoop is the leading site for addressing public sector issues.
 
GovLoop works with top industry partners to provide resources and tools to the government community. GovLoop has developed a variety
of guides, infographics, online training and educational events, all to help public sector professionals become more efficient Civil Servants.
 
GovLoop’s 2012 Technology in Review Report is sponsored by Cisco, Google, GovDelivery, HP, IBM, Oracle and Microsoft.

Executive Summary

GovLoop’s report, Unlocking the Power of Government Analytics, provides an overview of government analytics, identifies leading case studies, and integrates the results of a recent GovLoop survey of 280 members of the GovLoop community. Further, this report provides ten steps to leverage analytics within a government agency. This report includes expert interviews with:
 
  • John Landwehr, Vice President of Digital Government Solutions, Adobe
  • Nils Engel, Solutions Engineer, Adobe
  • Karen Danczak-Lyons, Library Director, City of Evanston
  • Beth Blauer, former Maryland StateStat Director
  • Staff Members, Centers for Medicaid and Medicare Services
 
The case studies highlighted in this report identify strategies for agencies to identify cost saving measures, improve performance and efficiency, and develop new strategies to enhance customer service through analytics. Undoubtedly, analytics is transforming the way government operates and delivers services to customers. At all levels of government, agencies are now challenged to reduce costs, improve efficiency, and implement innovative measures to meet the complex needs of an agency.
 
On a day-to-day basis, government agencies collect, create, store and manage large volumes of data. Whether the data is from transactions online, visits to web pages, interactions on social media, government agencies are creating enormous volumes of data daily. Likewise, the data collected is either in structured and unstructured formats, which makes extracting knowledge a challenge. In some cases, decision makers do not have access or the right data infrastructure to support improved decision-making through analytics.
 
In a survey of the GovLoop community, participants were asked if they are using analytics in their agency. Overwhelmingly, agencies reported they are indeed using analytics in some capacity. In the survey, 57 percent of respondents stated, “Yes - but a very basic level,” 12 percent stated that they rely on robust analytics, 12 percent also stated they are currently exploring how to best use analytics, and 19 percent stated they are not using analytics yet, but would like to learn more. From our survey, it is clear that agencies understand the power of government analytics, and that there is enormous potential in using analytics to reduce costs, increase efficiencies and improve
service delivery.
 
At GovLoop, our mission is to “connect government to improve government.” Consistent with our mission, this report provides improved awareness to the power of government analytics to help transform and modernize government agencies. With any questions about this report, please reach out to Pat Fiorenza, Senior Research Analyst, GovLoop.

Are you using analytics in your agency?

AnalyticGuide2013Figure1.png

Unlocking the Power of Government Analytics

Expert Insights: Three Steps to Implement Analytics

In an environment where agencies are looking to streamline activities, maximize efficiency and significantly decrease waste, fraud and abuse, a comprehensive analytics solution should be considered by government agencies. John Landwehr, Vice President of Digital Government Solutions, and Nils Engel, Solutions Engineer, at Adobe provide expert insights on how to implement an analytics program in the public sector.
 
With the volume, variety, and velocity in which data is collected by government agencies, analytics software helps “organizations better manage their systems by measuring the efficiency, usage, trends, interactions, and the correlations of people, places, and things within their organization,” states Nils Engel, Solutions Engineer at Adobe in a recent interview with GovLoop.
 
In our interview, Landwehr described a few of the activities Adobe’s software completes for its clients:
 
“Adobe provides solutions that are both cloud-based as well as behind your firewall. So we can provide anything from standard web reporting, what people are doing on your website, how they get there, are they able to get to the end goal of the website, and all the way to very advanced analytics on the web data. Adobe can also see if people are trying to hack into your website, or doing things that they shouldn’t be on your website, and then are able to understand specifically where those attacks are coming from.”
 
To address the need to serve agencies with both low and high security requirements, Adobe has created differing programs to provide varying levels of services to a myriad of clients. These programs can exist on the cloud or behind a firewall. Such software can serve both small companies that are curious about the success of an application, or agencies in the intelligence community that desire to use complicated analytics but need tighter security.
 
The variety of activities that analytics software can complete ranges from complicated security analysis to simple data interpretation. Through analytics, programs can measure the potential lifespan of product popularity or the actual lifespan of a piece of high tech machinery. In today’s fiscal climate, data analysis is used to justify funding, to “provide proof of value,” states Engel. The data collected
as evidence can be:
 
• How often a document/application is reviewed or used
• If content by specific creators is used more than others
• Who has access to what document/application at what time
 
Instead of basing decisions and conversations on emotional or anecdotal evidence, agencies can use the data collected to provide concrete proof of whether applications/programs should be continued or eliminated. Processes such as these minimize waste and streamline budgeting debates.
 
According to Engel, one of the major benefits of increasing analytics in government is being able to “understand the impact and effectiveness of websites and mobile applications.” Agencies do not want to be left behind in the digital era, and by creating workable and usable online sources, they can better serve the public. However, the issue is not convincing agencies of the importance of creating an analytics program.
 
The challenge is informing agencies on the best practices and strategies to implement one. Adobe provided some insights as to ways an agency of any size and budget could begin to create an analytics program.
How to Implement an Analytics Program
Step One
Identify and Collect Data
Engel states, “It’s all about data, so you need to have access to data in order to do the analysis.” To start an analytics program, an agency must, of course, have data. The first step is to identify a data need and then to collect the data to support agency needs.
Step Two
Create a Plan
When starting an analytics program it is important to create a strategy. “Putting together a methodology that will collect the appropriate data about the events that are occurring so the analysis can be done” will ensure that the program will continue instead of hitting a wall soon after implementation.
Step Three
Get Managers on Board
Analytics needs to be implemented from a “top-down approach.” Managers need to be leaders in not only starting, but also continuing to support the program. With proper leadership, agencies can understand the goal of an analytics program and can ensure that employees’ responsibilities work towards analytics goals.
Adobe & Digital Government
AnalyticGuide2013Figure2.png
 
  • Effectively Measure Interaction and Engagement
  • Drive Adoption of Citizen Self Service
  • Improve Agency Performance
  • Reduce Administrative Costs
 
Adobe Analytics delivers the complete picture of digital citizen engagement that helps you track, analyze and optimize information services and citizen-agency engagement across multiple channels, including walk-up, call center, mail, online and mobile. Optimizing your agency’s online presence with Adobe SiteCatalyst while visualizing invaluable knowledge of historical website interactions with Adobe Insight will transform your agency’s online experience and reduce the opportunity for fraud. Adobe Analytics provides actionable intelligence for web, social, and big data.

Overview of Government Analytics

With the explosion of data in the public sector, government agencies are now challenged to identify ways to unlock the power of government analytics. Agencies are exploring ways to manage and store large volumes of data collected on a day-to-day basis. Whether it involves  transactions online, visits to web pages, or interactions on social media, government agencies are creating enormous volumes of data daily.
 
Ami Wazlawik, a former GovLoop Research Fellow, writes on GovLoop that analytics is “the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions.” (i)
 
In many cases, analytics has improved services, increased transparency, and provided increased accountability to government programs. Wazlawik continues:
 
“Given the reality of budget constraints and a seemingly endless supply of data, the advantages of using analytics are clear. A fully developed analytics system, however, does not simply appear overnight. Agencies need money, staff, appropriate data and technology, supportive leadership, and a goal or target, to really make the best use of analytics. Leaders need to work to integrate analytics into the work and culture of their respective agencies, and need to encourage cooperation between agencies and partnerships with outside organizations, when needed.”
 
Wazlawik could not be more accurate in her assessment.
 
To further reinforce Wazlawik’s observation and to ground it in feedback from the GovLoop community, consider some of the results from a survey GovLoop conducted on analytics, which elicited 280 responses. In that survey, participants were asked to define the benefits of analytics. The results of two questions from the survey can be found below.

What are the benefits of analytics in government?

AnalyticGuide2013Figure3.png

Survey participants were asked to define the benefits of adopting an analytics initiative. Participants could select all answers that applied from among transparency and accountability; improved efficiency and productivity; control waste, fraud and abuse; manages resources, budgets and controls costs; and drives improved decision-making. Respondents selected “drives improved decision-making” as the core benefit of analytics (68 percent). Respondents also stated other benefits of analytics, such as:
 
“Find the story lines you want to tell. See if data will speak to that,”
 
“Analytics helps tell the story of why investments in website are important. Their use helps define ROI and makes a case for web investments,”
 
“Decisions made from numerical data rather than gut feel,” and,
 
“Better allocation of human resources - time, energy, talent.”
 
On the other hand, four challenges for analytics also emerged from participant responses to the GovLoop survey: the volume, variety, velocity, and value of the data collected. These findings are consistent with current trends around big data, analytics and data use in the federal government. Our findings reinforce the need for improved education on how agencies can best leverage data and analytics within their agency.

What are some of the challenges for government analytics?

AnalyticGuide2013Figure4.png

Volume of data
The biggest challenge with government analytics is the “associated costs and time required to make the right decisions on what, when, who, how, in regards to collecting, managing and using the data,” said one GovLoop survey participant.
 
One of the core challenges for government employees is understanding the sheer number of data points that are created by the agency. In a recent white paper, Accelerating Government Intelligence to the Speed of Thought, Adobe writes:
 
“The sheer magnitude of the data that needs to be combed is staggering. In today’s digital world, individuals, corporations, organizations, and governments all generate huge volumes of data. Every phone call, credit card transaction, and visit to a website creates a data point that is collected and stored in a different system. This translates to massive datasets for analysts to access and manipulate.”
 
As this report has already indicated, the challenge for government employees is how to best map data to solve organizational challenges. With large volumes of data, it is imperative that agencies identify the right data for the problem at hand.
Variety
A second challenge is that data comes in a variety of forms, and standardizing information can potentially be a large undertaking for government agencies. The Adobe report further states:
 
“Data is in a multitude of disparate data types. Transactional data that provides information about events, such as credit card purchases, the details of a cell phone call, or an airline flight booking. There is also streaming data, or a continuous flow of log-type information that follows movement, such as a web log that streams information on what a user is doing—every page visited, every click, every item viewed, and so forth. Lastly, unstructured data is constantly being generated such as blogs, comments on articles, discussion forums, email, and other socially driven communication and commentary. Combining these disparate data types into one common environment for analysis is a daunting task.” 
 
Although this is a daunting task for government agencies, standardizing this data is imperative to leverage government analytics. Part of the challenge also involves working collaboratively across an agency to provide access to data that will be critical to the analytics program.
 
One respondent to the GovLoop survey confirmed this challenge, indicating that one of her top hurdles was, “Data entered multiple times in multiple places. Also, data validation is a huge issue as is culture that lives and dies by complicated and complex spreadsheets. In addition, benchmarking is difficult when exceptions are made for every 1 special case.”
Velocity
Data can change the minute or second it is captured. With the speed at which data is collected and moves, an analyst is challenged to keep up with rapidly changing data sets. This is where an analytics solution can be imperative to the success of an analytics program, as it can be seamlessly integrated into the decision-making process to track data and make data-based decisions.
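One way to keep pace with streaming data is to maintain running aggregates that update with each new event rather than re-scanning stored records. The sketch below uses Welford's online algorithm to track a mean and variance incrementally; the page-load metric and its values are a hypothetical example, not figures from the report:

```python
class RunningStats:
    """Incrementally track the mean and variance of a metric as events
    stream in (Welford's online algorithm), so a dashboard stays current
    without re-reading the full data set."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # running sum of squared deviations

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0


stats = RunningStats()
# Hypothetical per-event web metrics arriving one at a time.
for page_load_ms in [120, 130, 115, 900, 125]:
    stats.update(page_load_ms)
print(round(stats.mean, 1))  # 278.0
```

Each `update` is constant-time, so the same object can absorb events indefinitely; the spike at 900 ms immediately shows up in the variance.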
Identifying value
For government agencies, one constant challenge is identifying what data is high value, and what data may just be clutter. One survey respondent described this challenge as, “Parsing data to determine what applies to my program versus the overall agency.” Some data might be more valuable than other data for addressing challenges within an agency. For government leaders, they need to clearly articulate which information will allow them to make the soundest data-based decision.

Data Driven Decisions: How Analytics Transforms Services

A case study from Evanston Public Library

"Print is shrinking and trending more towards online databases and websites, so we are looking at usage patterns, which help guide
investment decisions.”
 
Karen Danczak-Lyons,
Library Director, Evanston, Illinois,
Public Library
 
Public libraries continue to be an important gathering place for communities. Libraries serve as meeting locations, provide access to the Internet, and function as a knowledge hub for people to research information.
 
Libraries across the country are undergoing a transformation in how services are delivered, and libraries are rapidly changing their service delivery models. As people have become increasingly reliant on technology, libraries have been adapting to the new ways people consume information. In order to keep pace with technology advancements, many public libraries are using analytics and data to modernize services and offerings to patrons. GovLoop recently spoke with Karen Danczak-Lyons, Library Director of the Evanston, Illinois, Public Library. The interview highlights how the Evanston Public Library has used data to transform library services and update systems to keep pace with service demands from patrons. Danczak-Lyons states, “Traditionally, especially at public libraries, we have looked at circulation as indicators of usage, but we are really broadening as our service delivery models have changed.”
 
Many of the changes Danczak-Lyons highlights in the interview were also reflected in a recent Pew study published in January of 2013, Library Services in the Digital Age. The study shows how library patrons desire an expansion of digital services, and continuation of print services. Danczak-Lyons expressed a very similar sentiment as the Pew study, stating:
 
“Print is shrinking and trending more towards online databases and websites, so we are looking at usage patterns, which help guide investment decisions. Not that we will ever get rid of print resources, but depending on the quality and depth of the resources of the database, that’s something we look at strategically and shift our resources more and more into that area.”
 
Clearly, libraries are transitioning how services are delivered. Much of the transformation is due to how the internet has dramatically changed how people access information. As libraries adjust their services, they are challenged to continue to provide traditional library resources. The Pew report states, “The availability of free computers and internet access now rivals book lending and reference expertise as a vital service of libraries.” Further, the report finds that 77 percent of people say free access to computers is a “very important” service provided by libraries.
 
In Evanston, as is true across the country, the public library is one of the only spots people can receive free internet access. Danczak-Lyons states:
 
“For many parts of our patron base, we are the only internet access they have. With so many people losing their jobs, and can no longer use the internet through work, or they can no longer afford it at home, we are creating the second digital divide, so people that have used the internet and value it, but no longer can afford it are now returning to public libraries more and more.”
 
Other data that the Evanston Public Library collects includes internet use data, foot traffic, database use, and website analytics. All of this data is used to improve the quality and kinds of services that are offered.
 
An analytics example that Danczak-Lyons identifies is that the library tracks information around desktop computers. The library looks at wait time and demand for public access to a computer. Knowing this information, staff can decide what kind of investments to make for internet use, whether that is decisions around installing new computers or providing more bandwidth for internet access. “We track not only internet sessions used, but the demand and wait time for computers guides us into decisions of where and what kind of computers to install,” states Danczak-Lyons.
3 Lessons Learned from Evanston Public Library

AnalyticGuide2013Figure5.png

The Evanston Public Library is an excellent case study as to how using analytics and data can help improve decision making. The conversation with Danczak-Lyons offered many best practices, lessons learned and tips for those invested in an analytics program. Below you will find the top three lessons learned from the Evanston Public Library.
1. Take Emotion Out of Decision Making
“Analytics helps take emotion out of the decision making. When resources are scarce or limited and the need is great, finding a fact based way to make decisions can help move programming forward,” states Danczak-Lyons.
 
No matter how large or small the decision, clarity when making a decision is essential. This does not mean that decisions are made in a robotic fashion. In fact, the calmness and clarity of a leader while making decisions shows the ability to manage emotions, and rationalize each decision. It’s an imperative skill for leaders, who are constantly learning how to manage emotions, in order to make the right decision for their organizations. When articulating a position and explaining a decision, it’s not just taking into consideration hard facts; it’s acknowledging and empathizing with the very human element of decision making. Analytics is essential to this process, and knowing that a decision is grounded in data can help decision makers arrive at a difficult and challenging decision.
2. Use Data to Define Customer Needs and Demands
“You need to be clear how you are defining and collecting these data points, and what is brought into an analysis to reach a conclusion,”  states Danczak-Lyons.
 
At the Evanston Public Library, clearly data has been collected and used to make informed decisions about how to best use data to improve services. Decisions around wireless data, desktop computers, research databases, all have been made due to advanced knowledge of how patrons are accessing information. “Our patrons have their own devices, not necessarily a computer, maybe a tablet or smartphone they want to use to access the internet, so we use this data to know if we need to invest in more bandwidth for internet,” states Danczak-Lyons.
3. Set Success Metrics
Critical to the success of a program is setting performance metrics and understanding how success is defined for an organization. As Danczak-Lyons states, “Identifying at the front end what you’re going to measure and how, and trying to describe what success looks like, so you know what success looks like when you see it.” This process is essential for managers. By defining success, agencies can set benchmarks, measure against past performance, and identify new methods to improve services.

Increasing Transparency Initiatives Through Analytics

A case study from Maryland StateStat

AnalyticGuide2013Figure6.png

 
Transparency is cited by public and private organizations as one of the major benefits of using analytics. While organizations in the public and private sectors may ultimately have differing goals, their use of data to improve transparency can be very similar. Transparency can be described as providing the public with large amounts of data about a company or agency in order to represent its activities. This information can reflect spending and potentially build support and trust from the public. It can also benefit the agencies themselves. Maryland StateStat has harnessed the power of analytics to transform Baltimore.
 
In the late nineties, Martin O’Malley came to office as mayor of Baltimore. At the time the city had a high murder rate, poverty was rampant and trash lined the streets. City employees would be absent from work and necessary tasks would go unfinished. New York City, which had experienced these problems a decade earlier, had been using a program called CompStat to collect data and then analyze it to learn more about the crime and other issues that plagued the city in search of remedies. O’Malley decided to adapt the New York model, creating CitiStat to address similar challenges in Baltimore. O’Malley wanted to run government like a business in the sense that government agencies would have a more concrete awareness of assets and functions. For example, O’Malley wanted to know how many trash removal trucks Baltimore had and where they were scheduled to pick up, so if trash was left on the side of the road, there was a clear path to holding someone accountable. On a larger scale, the Mayor wanted to cut down on violent crime and to work on sustainability projects, all of which he has since accomplished or is on track to accomplish. O’Malley was able to succeed because of the transparency that came from using analytics. The approach has since been extended to collecting and analyzing data for the rest of the state of Maryland as StateStat.
 
To get more insight on the effects of analytics and the benefits of transparency in the public sector, GovLoop interviewed Beth Blauer, former Maryland StateStat Director. According to Blauer, there are several major advantages to using analytics to create transparency in local government.
5 Lessons Learned from the Maryland StateStat Case Study
1. Know Your Agency
Blauer identifies, “getting your data all in one place provides an enormous opportunity to baseline and figure out where you are.” Analytics provides the opportunity for an agency to look at itself more objectively. Through analytics initiatives, agencies can reveal places where spending needs to decrease, how to improve communication across an organization and clearly define the quality of the services the agency provides.
2. Have Realistic Goals
Before analytics, government agencies had to rely on anecdotal evidence, limited data and the desires of leaders. With data now widely available, agency employees and constituents can understand the capabilities and limitations of an agency and then create expectations and goals.
 
Blauer states, “The use of data and analytics from a baseline perspective helps agencies create more realistic strategic goals, and strategic planning in government helps you contextualize what you’re trying to do.” In the end, both the agency and the public will be more satisfied with project outcomes.
3. Hold People Accountable and Recognize the Importance of Leadership
Accountability also lies at the heart of the desire for increased transparency. President Obama stated, “Transparency promotes accountability and provides information for citizens about what their government is doing. Information maintained by the Federal Government is a national asset.”
 
Maryland StateStat, according to Blauer, “was not designed as a transparency initiative; we were a performance management initiative.” The goal initially was to improve productivity; however, by using analytics StateStat learned how to collaborate and communicate with all of its partners to provide a more transparent work process.
 
Through Blauer’s interview, we have learned about the benefits that come from harnessing data and using it to benefit the functions of a local government agency. As one of the first states to widely use analytics, Maryland StateStat has shown the value of increased transparency.
 
Blauer described leadership as the most important step in any analytics initiative. “Leadership is the first thing,” she said, “you need to have a cheerleader.” Leadership provides legitimacy in the efforts of government employees. If a visible leader spearheads or supports an initiative, then other employees and constituents will be more likely to support and trust the program.
4. Show a Clear Business Value

AnalyticGuide2013Figure7.png

Though analytics adoption is increasing, local governments and the public may still not understand its benefits. If an agency makes its data and the results of its analytics available, this transparency will allow the public to see the value of the agency’s analytics program. On the Maryland StateStat website, there is a section called “Open Data Portal” that provides statistics on everything from the number of arrests to greenhouse gas emissions. By providing this information, citizens can know the state of their city and the efforts that their local government is making to improve their community.
5. Collect Right Data and Develop Baselines
When it comes to analytics, data is the most important aspect. This data needs to be information that will support the efforts of an agency. To determine the type of data an agency needs, the agency needs to determine what data it already has and what type of information will be needed, thus creating a baseline.
 
Blauer points out that “baselining helps you strategically figure out where you want to be, so you can start building your goals and not make decisions based on anecdotes or not based on evidence.” The data needs to be verified so that decisions are concrete and can be understood by employees and the public. In the beginning, says Blauer, many employees and constituents may question the validity of the data but “when you have an analytics tool that is constantly utilizing the data, over time your data is evolving. We were constantly testing the data and comparing it to other data, making sure we understood the data very well. We saw the quality increase over time.”

Fighting Waste, Fraud and Abuse through Analytics

"The ability to link providers through their social networks helps CMS and its law enforcement partners unravel the complex relationships among fraudulent providers and between providers and beneficiaries.”
 
Staff Members, CMS
Report to Congress

A case study from Centers for Medicare & Medicaid Services (CMS)

AnalyticGuide2013Figure8.png

GovLoop recently had the opportunity to speak with staff members from the Centers for Medicare and Medicaid Services (CMS). CMS staff provided expert insights on the importance of analytics to control waste, fraud and abuse. In 2010, CMS and the Department of Health and Human Services (HHS) launched an aggressive campaign to reduce fraud for medical claims.
 
To meet their goals of reducing fraud, CMS developed a new anti-fraud tool using predictive analytics and real-time data to spot abnormalities for fee-for-service claims. CMS has provided a thorough analysis of the Fraud Prevention System (FPS) in a recent report to Congress.
 
The FPS was required under the Small Business Jobs Act of 2010 (SBJA). In the report to Congress, CMS staff states:
 
“Since June 30, 2011, the FPS has run predictive algorithms and other sophisticated analytics nationwide against all Medicare fee-for-service (FFS) claims prior to payment. For the first time in the history of the program, CMS is systematically applying advanced analytics against Medicare FFS claims on a streaming, nationwide basis.”
 
The SBJA identifies specific functional requirements of the predictive analytics technologies for CMS, including the following:
 
  • Provide a comprehensive view of Medicare FFS provider and beneficiary activities to identify and analyze provider networks, billing patterns, and beneficiary utilization patterns and identify and detect patterns that represent a high risk of fraudulent activity
  • Integrate fully with the Medicare FFS claims flow 
  • Analyze large datasets for unusual or suspicious patterns or anomalies before payment and prioritize suspicious activity
  • Capture outcome information to continually refine and enhance the system
  • Prevent payment of fraudulent claims.
 
The Fraud Prevention System (FPS) developed by CMS uses predictive analytics technology to identify and prevent payment of fraudulent fee-for-service (FFS) claims. Since June 2011, CMS has been using the FPS to screen FFS claims. To build the system, CMS worked across sectors to learn best practices from the telecommunications and banking industries.
 
The report to Congress identifies three analytic strategies that CMS has implemented: anomaly detection models, predictive models, and social network analysis.
 
In our interview, CMS staff identified that through robust analytics adoption, the agency is able to improve the allocation of investigative resources and become more efficient at identifying fraudulent claims.
Anomaly Detection Models
The first model used by CMS is the anomaly detection model, a sophisticated model that defines thresholds of acceptable behavior. This model compares an individual provider’s behavior patterns with those of a peer group. The report states:
 
“Certain behaviors and characteristics that indicate potential fraud may also be indications of acceptable behavior. For example, if a provider bills for many more services than are normally performed by similar providers in a defined time period, the FPS can alert an investigator to inspect the claim prior to payment.”
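The peer-comparison idea can be sketched in a few lines. This is an illustrative toy, not CMS's actual model: the provider IDs, the billing-volume metric, and the z-score threshold are all invented for the example.

```python
from statistics import mean, stdev

def flag_outliers(claims_per_provider, threshold=2.0):
    """Flag providers whose claim volume is far above the peer group.

    claims_per_provider maps provider id -> services billed in a fixed
    time window. Returns ids whose volume exceeds the peer mean by more
    than `threshold` sample standard deviations.
    """
    volumes = list(claims_per_provider.values())
    mu, sigma = mean(volumes), stdev(volumes)
    if sigma == 0:  # all peers identical: nothing stands out
        return []
    return [pid for pid, v in claims_per_provider.items()
            if (v - mu) / sigma > threshold]

# A peer group of similar providers; P007 bills far more than its peers.
peers = {"P001": 40, "P002": 38, "P003": 45, "P004": 41,
         "P005": 39, "P006": 43, "P007": 310}
print(flag_outliers(peers))  # ['P007']
```

A production system would use more robust statistics (for example median-based scores) and many behavioral features, but the shape of the comparison is the same: define what "normal" looks like for the peer group, then surface claims that fall outside it for review before payment.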
Predictive Analytics
CMS has also developed rigorous predictive models to identify fraudulent claims. These models use data collected from previous fraud cases to help predict future instances of fraud while allowing CMS to investigate suspicious claims. The CMS report states, “Developing predictive models requires advanced analysis because a fraudulent claim may become apparent only when factors are considered in combination; whereas independently, those factors may not be suspicious.”
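The "factors in combination" idea can be illustrated with a toy scoring rule. The factor names, weights, and threshold below are entirely hypothetical (real models are trained on historical fraud cases); the point is only that no single weak signal exceeds the review threshold, while several together do:

```python
# Hypothetical risk factors and weights, for illustration only.
WEIGHTS = {
    "new_provider": 0.3,
    "high_dollar_amount": 0.3,
    "beneficiary_far_from_provider": 0.3,
}
REVIEW_THRESHOLD = 0.7

def needs_review(claim_flags):
    """Return True when the combined risk score of a claim's flagged
    factors crosses the review threshold."""
    score = sum(WEIGHTS[f] for f in claim_flags if f in WEIGHTS)
    return score >= REVIEW_THRESHOLD

# One factor alone is not suspicious...
print(needs_review({"new_provider"}))                       # False
# ...but the same factor in combination with others is.
print(needs_review({"new_provider", "high_dollar_amount",
                    "beneficiary_far_from_provider"}))      # True
```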
Social Network Analysis
Social network analysis models are also used by CMS to identify links among fraudulent providers. “The ability to link providers through their social networks helps CMS and its law enforcement partners unravel the complex relationships among fraudulent providers and between providers and beneficiaries,” states CMS.
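At its core, this kind of linking is a graph problem: providers become nodes, shared attributes (an address, a beneficiary, a billing agent) become edges, and connected components reveal clusters worth investigating. A minimal sketch, with hypothetical provider IDs and link data:

```python
from collections import defaultdict

def provider_clusters(links):
    """Group providers into clusters connected by shared attributes.

    links: iterable of (provider_a, provider_b) pairs, e.g. two providers
    that share an address or a beneficiary. Returns the connected
    components of the resulting graph as a list of sets.
    """
    graph = defaultdict(set)
    for a, b in links:
        graph[a].add(b)
        graph[b].add(a)

    seen, clusters = set(), []
    for node in graph:
        if node in seen:
            continue
        # Depth-first traversal to collect one connected component.
        stack, cluster = [node], set()
        while stack:
            n = stack.pop()
            if n in cluster:
                continue
            cluster.add(n)
            stack.extend(graph[n] - cluster)
        seen |= cluster
        clusters.append(cluster)
    return clusters

links = [("P1", "P2"), ("P2", "P3"), ("P7", "P8")]
print(provider_clusters(links))  # [{'P1', 'P2', 'P3'}, {'P7', 'P8'}]
```

Real investigations weight the edges by how suspicious the shared attribute is, but the clustering step looks much like this.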
 
The FPS has saved CMS millions of dollars and allowed the agency to identify suspicious claims and patterns of behavior. This process helps CMS understand trends, and because claims are analyzed in real time, FPS can spot fraud immediately, saving the time of investigators and CMS staff.
 
CMS staff mentioned that for those getting started with analytics, it is essential to clearly define the problem that is being solved, be sure the right data is collected, and identify resource needs for an analytics project.

Ten Steps to Leveraging Analytics in the Public Sector

"It may cost and take time to run through scenarios/do a feasibility study before installation, testing, and implementation, but it’s worth the trouble.”
GovLoop Survey Participant
 
"Benchmarking as another data point to judge success; informed decision making.”
GovLoop Survey Participant
 
"I have learned the best analyses are best completed by those who have a clear understanding of the mission of the agency and defined goals of the websites. Additionally, there must be a plan for implementation based on analytics gathered even with the current state of the budget.”
GovLoop Survey Participant
 
"Building an analytical culture, where data is widely used to evaluate deductions, is critical to becoming an analytically competitive organization.”
GovLoop Survey Participant
 
Whether you are looking to implement predictive analytics or to use your web data in a more compelling way, this report has highlighted common challenges, case studies, and strategies to get you started. It concludes with ten steps to truly leverage analytics in the public sector.
AnalyticGuide2013Figure9.png

1. Collaborate with peers and work across silos to share data and resources

Reach out to people who face similar challenges and have already worked through them while adopting analytics programs. Peers within an agency can serve as an invaluable resource for the success of any analytics program.
 
Collaboration is also essential for analytics adoption. Agencies will have to work collaboratively across departments to share information and resources. This will also build trust, improve communications and help to develop a culture of analytics across the agency.

2. Strong leadership needed

Leadership needs to support the analytics program, provide adequate resources, and clearly chart a course for the analytics initiative within the agency. Without leadership, analytics initiatives will not be able to fully leverage the potential of data-driven decision making. Leaders should encourage analytics programs within an agency and work to craft a culture of data-based decisions.

3. Clear vision of how to use analytics to solve a problem

For analytics programs to be truly successful, the program needs to be clearly mapped towards an organizational challenge or deficiency that can be solved with proper data analysis.
 
The importance of having a clear vision became extremely clear throughout our interviews and survey. One survey respondent said, “I have learned the best analyses are best completed by those who have a clear understanding of the mission of the agency and defined goals of the websites. Additionally, there must be a plan for implementation based on analytics gathered even with the current state of the budget.”

4. Identify the right data

Start by understanding what data is needed to solve the problem, and know where to access, share and store this information. This process is critical, and agencies need to know that the data they have is the right data for the problem they are trying to solve. Once the right data has been identified, agencies can begin to benchmark data and measure against past performance for any indicators they define. As one survey participant identified, “Benchmarking as another data point to judge success; informed decision making.”
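Once a baseline is defined, benchmarking boils down to comparing each new measurement against it. A trivial sketch (the metric values here are made up for illustration):

```python
def vs_baseline(current, baseline):
    """Percent change of a metric against its benchmark baseline."""
    return round(100.0 * (current - baseline) / baseline, 1)

# E.g. 1,150 portal visits this quarter against a 1,000-visit baseline:
print(vs_baseline(1150, 1000))  # 15.0 -> up 15% over the baseline
```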

5. Address challenges for implementation

Analytics presents many challenges. Be sure to map these out up front and think strategically about how to work through them. One strategy is to engage core stakeholders early and continue to integrate them throughout the process. This means clearly defining the impact on related programs, which employees will be directly involved, and the strategies needed.

6. Assess workforce and build analytics support team

Training for analytics is essential. Once an analytics program has kicked off, everyone on the team should know how to use the tools and access the information. Analytics is still emerging within the public sector, and although its value is clear, it can be met with resistance. Like any initiative that is disruptive to an agency, innovators must elicit support from their peers and senior officials.
 
Although budgets may not always be flexible enough to allow it, it typically makes sense to have a dedicated analytics group within the agency. A small, focused, and passionate team is necessary for analytics. Team members should feel empowered to work in a high-functioning, collaborative environment. With a passionate team, analytics can move to broader adoption at the agency.

7. Pilot – start small and build quick wins

“It may cost and take time to run through scenarios/do a feasibility study before installation, testing, and implementation, but it’s worth the trouble,” states a survey participant.
 
Like many IT initiatives, analytics programs take time, and agencies should focus on small iterations to improve. Success will not occur overnight. If agency leaders start with a small pilot program and gain a few quick wins that show the value of analytics, the program will have more support and greater success with broader adoption. By starting small, many challenges can be addressed with more attention, easing implementation efforts as the analytics program scales to the rest of the agency.

8. Choose the proper reporting techniques

This report touched on a few of the reporting techniques that government agencies can use. Agencies need to be sure that the techniques match the needs of the organization. In many instances, agencies are collecting various kinds of data, combining data from different sources, and running advanced analytics programs. To streamline information, agencies need to set reporting strategies for different channels and resources. It is the responsibility of agencies to present a coherent reporting strategy to employees and constituents that caters to each specific analytics need.

9. Map to strategic needs at agency

It’s imperative that the agency map out the strategic goals of analytics and the problem it is trying to solve. Once that foundation is set, agencies can work to identify the correct data streams, the format of the data, and the tools and resources needed to achieve the end goal. It’s all about, as Engel stated, “making sure that the application, the types of events, that the source of the data that you want to analyze have the appropriate data to enable you to ask the questions that you want to answer.” This process is at the core of any analytics program for government.

10. Understand any cultural shifts that may take place

“Building an analytical culture, where data is widely used to evaluate deductions, is critical to becoming an analytically competitive organization,” states a GovLoop survey participant.
 
With analytics, there may be changes to culture, and leaders will potentially face organizational and cultural challenges. These challenges are common in any kind of technology adoption, and leaders must be aware that they may arise down the road. It is imperative to the success of an analytics program that a leader show the underlying organizational value of analytics and how it will help guide the agency toward its goals. Further, a leader must be sensitive to any changes in employees' day-to-day work. In addition, agencies must be aware of wider cultural shifts, especially in the realm of technology.

5 Questions to Ask for Analytics Initiatives

AnalyticGuide2013Figure10.png
 
Whether it is bringing challenges to leadership, identifying value, or understanding costs, agencies today are exploring how to truly leverage the power of analytics. As many government websites become increasingly transactional and services move to self-service platforms, it is becoming clear that the volume of data government collects, creates, stores, and manages can be used to transform how services are delivered. For government leaders looking to implement an analytics program, five basic questions can assist in the decision-making process:
  • Do employees have access to the right information?
  • What problems are we trying to solve or what service needs improvement?
  • How do agencies connect the dots and access other data?
  • What new value has been created for customers and what’s the return on investment?
  • How do we show that value to leadership and change culture with data?
 
Although these questions are important, one of the first steps to unlocking the power of analytics is basic data discovery: knowing what format data is in, how to access it, verifying its authenticity, and beginning to baseline information. Since data can be structured (relational databases), semi-structured (think XML and email), or unstructured (not predefined; doesn't fit well into relational databases), it is important to learn what format your data is in.
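The practical difference between these formats shows up as soon as you try to ingest them. A minimal sketch (the agency names and field layout are invented examples): structured CSV maps cleanly onto fixed columns, while semi-structured XML carries its structure in tags whose fields may vary record to record.

```python
import csv
import io
import xml.etree.ElementTree as ET

def rows_from_csv(text):
    """Structured data: every CSV row maps onto the same fixed columns."""
    return list(csv.DictReader(io.StringIO(text)))

def records_from_xml(text):
    """Semi-structured data: each record's fields come from its child tags,
    which need not be identical across records."""
    return [{child.tag: child.text for child in rec}
            for rec in ET.fromstring(text)]

csv_rows = rows_from_csv("agency,visits\nDOE,120\nHHS,340\n")
xml_recs = records_from_xml(
    "<records><rec><agency>DOE</agency><visits>120</visits></rec></records>")
print(csv_rows[0]["agency"], xml_recs[0]["visits"])  # DOE 120
```

Unstructured data (free-text documents, scanned forms) has no such mapping at all, which is why discovery starts with identifying which of the three categories each data source falls into.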

Do employees have access to the right information?

Understanding where data rests, the format of the data, and who has access is essential to any analytics initiative. Once the decision is made to invest in an analytics strategy, agencies must locate and ensure that they are actually collecting the right data to solve an organizational problem. For instance, if an agency is going to start recruiting employees because members of the workforce are retiring, it needs to locate data that shows retirement trends and expected retirement dates, and then craft its strategy around available information, or survey and collect the right information.

What problems are we trying to solve or what service needs improvement?

With the large volume of data that is created and stored, agencies must take a laser-like focus to solving problems with data. If the end goal is to increase self-service through a new online portal, goals and metrics should be defined that clearly map to the overarching goal.

How do agencies connect the dots and access other data?

Once data is collected, stored, and understood, it is important that information is shared securely and safely. If information is confidential or sensitive, it likely should not be shared. But if information is not sensitive, like web traffic or page visits, it should be shared with peers to help them see the impact of any program they are running.

What new value has been created for customers and what’s the return on investment?

AnalyticGuide2013Figure11.png

A great case study of using customer insights and data to improve government services comes from the City of Santa Cruz. Emily Jarvis, GovLoop's Online Producer, shared the story on the DorobekINSIDER; you can check it out by following the links below. Jarvis writes:
 
“The City of Santa Cruz is the smallest community to ever partner with Code for America, but it had one of the largest problems to solve: how to make it easier to take an idea for a small business from conception to reality. They created an online permitting portal OpenCounter. The portal launched last Wednesday January 9, after an intense year of development, testing, and refinement.”
 
This question is also particularly important, because it will indicate how data has improved a service. With clear metrics set, and a thorough analysis of data, agencies can calculate their return on investment for analytics.

How do we show that value to leadership and change culture with data?

Recently GovLoop’s Research Fellow, Kate Long, wrote a great post identifying that leadership is a missing piece to analytics. Be sure to check out her post, as she does a great job outlining some of the challenges to leadership and analytics. For those leading government analytics programs, it is essential to continue to craft a culture of openness, placing a strong emphasis on program effectiveness and supporting employee needs for technology.
 
Starting, implementing, reforming, or ending a program in government is no easy task, but the right decision can be made with sound data analysis. As government continues to develop more data and becomes increasingly complex, analytics is going to play an essential role in transforming how government operates.

Final Thoughts & Acknowledgements

GovLoop’s mission is to connect government to improve government. We aim to inspire public sector professionals by acting as the knowledge network for government. The GovLoop community has over 60,000 members working to foster collaboration, solve problems and share resources across government.
 
The GovLoop community has been widely recognized across multiple sectors. GovLoop members come from across the public sector. Our membership includes federal, state, and local public servants, industry experts and professionals grounded in academic research. Today, GovLoop is the leading site for addressing public sector issues.
 
GovLoop works with top industry partners to provide resources and tools to the government community. GovLoop has developed a variety of guides, infographics, online training and educational events, all to help public sector professionals become more efficient civil servants.
 
The GovLoop team is thankful to all of those who contributed to the development of this report. We thank everyone for their participation in our survey, active community engagement, input and knowledge shared while developing this guide. This guide would not have been completed without your assistance and sharing of knowledge about government analytics. Special thanks to our experts interviewed for this report, and Adobe for their sponsorship of this report.
 
• John Landwehr, Vice President of Digital Government Solutions, Adobe
• Nils Engel, Solutions Engineer, Adobe
• Karen Danczak-Lyons, Library Director, City of Evanston
• Beth Blauer, former Maryland StateStat Director
• Staff Members, Centers for Medicaid and Medicare Services
 
GovLoop Authors
• Pat Fiorenza, GovLoop Senior Research Analyst
• Kate Long, GovLoop Research Fellow
• Jeff Ribeira, GovLoop Senior Interactive Designer
• Carolyn Moeger, GovLoop Design Fellow
• Andrew Krzmarzick, Director of Community Engagement
 
For more information on this report, please contact Pat Fiorenza, GovLoop Senior Research Analyst, at pat@govloop.com

GovLoop Resources

The following links are some of GovLoop’s best resources of 2012. Be sure to take a look at some of the blogs, webinars, guides and trainings and join the conversation.
 
DorobekINSIDER Live - Experts Weigh in on Tech Trends: GovLoop: The Knowledge Network for Government, Emily Jarvis. December 20, 2012.
 
Government and Analytics: A Brief Overview: GovLoop: The Knowledge Network for Government, Ami Wazlawik. December 10, 2012.
 
10 Benefits of Predictive Analytics: A Path to Improved Decisions: GovLoop: The Knowledge Network for Government, Pat Fiorenza. November 26, 2012.
 
IBM Report Highlights the Power of Predictive Analytics: GovLoop: The Knowledge Network for Government, Pat Fiorenza. September 25, 2012.
 
3 Benefits of Location Analytics for the Public Sector: GovLoop: The Knowledge Network for Government, Pat Fiorenza. February 25, 2013.
 
Improving Accountability & Making Data Driven Decisions - Analytics in 2012: GovLoop: The Knowledge Network for Government, Pat Fiorenza. December 26, 2012.
 
Newest Federal Jobs: Data Scientists: GovLoop: The Knowledge Network for Government, Emily Jarvis. January 3, 2013.
 
Is Leadership the Missing Link for Analytics?: GovLoop: The Knowledge Network for Government, Kate Long. January 17, 2013.
 
Measuring the Unobserved: GovLoop: The Knowledge Network for Government, John Kamensky. October 29, 2012.
 
Interview: Beth Blauer, former employee, Maryland StateStat. January 17, 2013.
 
Interview: John Landwehr, Vice President of Digital Government Solutions, Adobe; Nils Engel, Solutions Engineer, Adobe. February 7, 2013.
 
Interview: Karen Danczak-Lyons, Library Director of the Evanston, Illinois, Public Library. January 3, 2013.
 
Interview: Staff, CMS. January 24, 2013.

Back Cover Page

AnalyticGuide2013BackCoverPage.png
