ASMC The Business of Defense - Federal News Network
https://federalnewsnetwork.com

CISA looks to set the example for data stewardship under ‘zero trust’ push
https://federalnewsnetwork.com/cybersecurity/2024/06/cisa-looks-to-set-the-example-for-data-stewardship-under-zero-trust-push/
Tue, 18 Jun 2024 19:33:59 +0000
CISA is helping agencies advance data security, while ensuring it has its own data house in order.


Under the ongoing federal “zero trust” push, data is often considered one of the most important but least mature areas for federal agencies.

The Cybersecurity and Infrastructure Security Agency (CISA), which maintains the “zero trust maturity model” that serves as a roadmap for agencies, is also working to better understand, protect, and connect its cybersecurity data, according to Grant Dasher, architecture branch chief within the office of the technical director at CISA.

“Data is one of the areas of the zero trust transition that probably has gotten a little bit less attention, but that’s not because it’s not critically important,” Dasher said on Federal News Network. “We do think it’s critically important.”

Dasher said one of his big jobs is to help CISA’s cybersecurity teams gain an understanding of the agency’s internal data holdings. That work is critical to programs like Continuous Diagnostics and Mitigation (CDM), which provides cybersecurity services and dashboards to the entire federal civilian executive branch.

“We are applying strong security controls to the data that we steward, and making sure that we understand it and connect it between different parts of the mission, so that they can make effective use of it,” Dasher said.

CISA’s chief data stewards

To help address the data challenge, Dasher said CISA has identified “chief data stewards” who are responsible for managing specific datasets across the agency. Those responsibilities include identifying the metadata characteristics that are necessary to both share and protect the information in question.

“We think developing that understanding is critical, because then on top of that, you can put in place data governance controls,” Dasher said. “You can say, ‘Okay, well, this person is the data owner, or the data steward. And so this is the person who should be able to approve, for example, access requests to that data by other parts of the organization’.”

CISA’s zero trust support

Combining data access controls with strong identity governance is a key aspect of moving away from perimeter-based cybersecurity and toward a zero trust architecture.

Within the CDM program, CISA has made a major investment in Endpoint Detection and Response (EDR) tools that agencies are adopting as part of the zero trust push. Dasher said CISA has also helped some smaller agencies with identity security. And the cyber agency is also helping agencies adopt its Secure Cloud Business Applications (SCuBA) guidance.

Ultimately, though, Dasher said there’s no one-size-fits-all solution to improving data security across federal agencies. But he said it’s key for agencies to embrace established best practices in cyber risk management.

“There’s a natural tension here between enabling access to support the mission and providing security,” Dasher said. “We can’t let security become something that prevents the government from delivering services to its constituents. But we have to protect the data. And so finding how to triangulate that is really the crux of data protection.”

Tennessee Valley Authority takes on NARA’s digitization mandate
https://federalnewsnetwork.com/federal-insights/2024/06/tennessee-valley-authority-takes-on-naras-digitization-mandate/
Tue, 18 Jun 2024 14:11:52 +0000

Federal Insights - Records Management - 06/18/2024

After June 2024, the National Archives and Records Administration will no longer accept analog records into its collections. Agencies across the country are tasked with transitioning paper and physical records into digital formats. One of those agencies, the Tennessee Valley Authority, has already taken significant steps to meet those requirements.

The Tennessee Valley Authority, a federally owned electric utility corporation, provides services to Tennessee and portions of Alabama, Mississippi, Kentucky, Georgia, North Carolina and Virginia, serving a population of approximately 10 million customers. With that mission comes the responsibility to collect, maintain and provide public access to an enormous and ever-growing body of records and documents, such as annual and quarterly reports, bond offerings, notices of reservoir levels, maps, charts, models and other general information.

“Over the last five years, we’ve consistently generated around five million electronic records a year. That’s just under about 10 terabytes of records. Now overall, on any given day, we manage around 200 million temporary and permanent records that are in various stages of the records management lifecycle,” said Rebecca Coffey, Tennessee Valley Authority agency records officer and senior manager of enterprise records, on Federal Insights – Records Management.

Temporary and permanent records

The TVA, like most agencies, manages a collection of temporary and permanent records. The process for disposition of records is public, giving stakeholders continued visibility into where records are in the records management lifecycle.

“An agency will prepare a proposed schedule, send it to their designated archivist. Once we get an informal nod that this looks good and it aligns with other agencies, then those schedules get published in the Federal Register, and the public and other federal agencies have a chance to comment on them,” Coffey said.

Records come in a variety of formats. Paper reports make up about 70% of what is generated at TVA. The other 30%, considered mixed media, consists of text messages, emails, instant message chats, maps, photographs and other data streams that are generated as TVA conducts its work, like monitoring river levels.

“When they go to NARA for the final approval, they will decide the time frame. If the records have significant value, historical value, whether they are telling the story of TVA or the impact in the national story, the federal records will become permanent, which means that at a designated time, they will get sent to NARA. We will turn over ownership of those records to the National Archives,” Coffey said, on the Federal Drive with Tom Temin. “We obviously have different plans for how we’re going to manage those records, and it changes every year. With new technology comes new formats.”

Digital transformation

“NARA is not focused so much on ‘have you digitized everything by this deadline,’ as much as it is ‘what is your plan,’ because obviously, across the federal agencies we don’t all have the resources to be able to implement it immediately,” Coffey said.

Agencies must also contend with digital records organization. Once transformed from paper to a digital entity, documents must also be encoded with metadata to make them searchable and easily retrievable when needed. NARA works as a partner for agencies in the digitization process. In the past, most records could be counted on to be a printout, but that has changed. Following directives from NARA and the Office of Management and Budget, issued in 2011 and beyond, that required an overhaul of records management, TVA has been operating under a series of initiatives.
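The metadata step described above can be made concrete with a short sketch. This is not TVA’s or NARA’s actual schema; the field names, record values and the simple keyword search are hypothetical, shown only to illustrate how pairing a scanned image with descriptive metadata makes it retrievable:

```python
from dataclasses import dataclass, field, asdict
import json

# Hypothetical metadata envelope for a digitized record.
# Field names are illustrative, not prescribed by NARA.
@dataclass
class DigitizedRecord:
    record_id: str
    title: str
    record_group: str
    date_created: str              # ISO 8601 date of the original document
    digitized_on: str              # ISO 8601 date of scanning
    disposition: str               # "permanent" or "temporary"
    keywords: list = field(default_factory=list)

    def to_json(self) -> str:
        # Serialize the metadata so it can travel with the image file.
        return json.dumps(asdict(self), indent=2)

def matches(rec: DigitizedRecord, term: str) -> bool:
    # A trivial keyword search: metadata, not pixels, is what makes
    # a scanned document findable.
    t = term.lower()
    return t in rec.title.lower() or t in [k.lower() for k in rec.keywords]

record = DigitizedRecord(
    record_id="TVA-1948-000123",   # hypothetical identifier
    title="Annual reservoir level report",
    record_group="Reservoir Operations",
    date_created="1948-06-30",
    digitized_on="2024-06-18",
    disposition="permanent",
    keywords=["reservoir", "annual report"],
)

print(matches(record, "reservoir"))  # True
```

The point of the sketch is the design choice: without the metadata envelope, a digitized record is just an image file; with it, disposition tracking and search become simple data operations.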

“I’m really proud of the work our TVA team is doing. This has to do with some of our cultural resource records, our mapping and aerial photographs. We have a collection that goes back 100 years, and obviously those are not going to be the easiest things to digitize. We have our project: We are processing over a million frames of film and about 300,000 hard-copy maps. And what the team has done is they’ve thought about not just how we use them today, but how we could possibly use them in the future,” Coffey said.

This vendor tested its AI solutions on itself
https://federalnewsnetwork.com/federal-insights/2024/06/this-vendor-tested-its-ai-solutions-on-itself/
Tue, 18 Jun 2024 14:08:54 +0000
IBM provided its own grounds for testing and developing a set of AI tools. It can help client organizations avoid some of the initial mistakes.


As its own ‘client zero,’ IBM identified its human resources function back in 2017 for transformation with artificial intelligence. Today, the function is fully automated, and IBM has a wealth of insights and learnings to share that it hopes can help federal agencies avoid some of the same pitfalls.

IBM took an AI-driven approach to transforming its HR function. For its test bed, the company used itself and came away with valuable lessons learned.

Now IBM can help federal agencies apply those lessons and — hopefully — avoid some of the same mistakes. That’s according to Mike Chon, IBM’s vice president and senior partner of talent transformation for the U.S. federal market.

“IBM has gained the efficiencies, it’s delivered on the employee experience, it has achieved a lot of the automations [and] productivity gains,” Chon said.

He cited statistics that tell the story. IBM employees have had nearly two million HR conversations with a virtual agent. Those have achieved resolution in 94% of the cases, meaning the employee didn’t need to proceed to a conversation with a live person.

Manager productivity

When seeking HR efficiencies, organizations tend to think initially in terms of self-service for employees. But Chon urged IT and HR staffs to think more broadly to include managers too.

“I also want to emphasize manager self-service,” he said. “I think that’s where the additional value can come in.”

It also requires a bit of rewiring of manager habits. Chon said that initially, he, like many experienced managers, was less inclined to invoke a chatbot than to simply call his HR representative with questions.

“I myself did not really adopt that [AI] paradigm right away,” he said. “My muscle memory was to call an HR person. Clock forward to today … I actually tend to go to our AI chatbot more than an HR manager.”

He added that IBM managerial uptake of the HR chatbot has reached 96% worldwide, accounting for 93% of transactions.

HR presents a natural entry point for AI because it touches everyone.

“By introducing AI through HR, you’re really having this ability to embed the use of these tools throughout your enterprise,” Chon said. “I think that really starts to get people more comfortable.”

Use case approach

Having chosen the HR function, Chon said, IBM initially tried an overly comprehensive approach.

“When we first started this journey, we tried to boil the ocean. It was this big bang approach,” Chon said.

The company realized almost immediately that the tool wasn’t quite right, and people weren’t embracing it.

Lesson learned?

“Never seek the silver bullet,” Chon said. “It really forced everyone to put the brakes on this process” and rethink their approach.

The rethinking resulted in what Chon called a building block, use case-by-use case approach. The team started by identifying specific high-frequency or highly repetitive tasks, the automation of which would allow the team to spend less time on routine work and more on strategic, value-add work. Data connected to each task helped with this identification, which ultimately allowed the team to settle on two use cases: employee time off and proof-of-employment letters. Before AI, employees would ask their HR representative how many vacation days they had left, and it could take days for HR to prepare and send proof-of-employment letters, Chon said. These tasks were among the most repetitive and time-consuming for the function.

“AI gave employees the ability to find out their vacation days in seconds and generate their own employee verification letter from anywhere, anytime. And they get instant satisfaction because it happens right in front of them,” Chon said.

In the employment verification letter use case, AI took the form of robotic process automation, he added.

Moreover, if a particular step to a task doesn’t work, HR and IT could simply turn it off and improve it, without affecting everything else that’s working well.

It’s also important to understand that in a small percentage of cases, employees will need to interact with humans; no AI agent can do everything. Therefore, Chon said, “we always give people the ability to connect to a live agent.” Careful data analysis of what leads to “off-ramps” helps with continuous improvement of the AI tool, he said.

Ultimately, Chon said, the HR AI-driven self-service option for employees and managers lets HR professionals become more productive, taking the drudgery out of HR processes, leaving people more time for “tackling things like recruiting and other high value activities like talent development.”

The key lessons from IBM’s experience center on a use case-driven approach. AI is successfully adopted through small wins, building blocks and incremental steps. Larger, more strategic and transformational use cases don’t have one clear answer or outcome. The key is finding a use case (a workflow, process or task) that could be accelerated or improved through automation. This also allows for easier scaling to other parts of the agency.

“Now, I would say, seven years later, each time the team launches a new use case, it’s actually getting better and better,” Chon said.


Federal Executive Forum CTO’s Profiles in Excellence in Government 2024: Innovation and Emerging Technologies
https://federalnewsnetwork.com/cme-event/federal-insights/federal-executive-forum-ctos-profiles-in-excellence-in-government-2024-innovation-and-emerging-technologies/
Tue, 18 Jun 2024 13:25:01 +0000

Technology in government continues to change rapidly, and agencies must work closely with each other and private sector partners to drive innovation and success. What initiatives have been successful and what are plans for the future?

During this webinar, you will gain the unique perspective of top federal and industry technology experts:

  • David Larrimore, Chief Technology Officer, Department of Homeland Security
  • Kaschit Pandya, Chief Technology Officer, Internal Revenue Service
  • Doug Robertson, Chief Technology Officer, Small Business Administration
  • Christopher Wallace, Chief of Cybersecurity and Chief Technology Officer, Program Executive Office, Defense Healthcare Management Systems
  • Adam Clater, Chief Architect, North American Public Sector, Red Hat
  • Greg Carl, Principal Technologist, Pure Storage
  • Moderator: Luke McCormack, Host of the Federal Executive Forum

Panelists also will share lessons learned, challenges and solutions, and a vision for the future.

Expanding CISA’s zero trust role is smart: Here’s why
https://federalnewsnetwork.com/federal-insights/2024/06/expanding-cisas-zero-trust-role-is-smart-heres-why/
Mon, 17 Jun 2024 17:48:12 +0000

This content was originally posted by Booz Allen Hamilton.

Picture this: The president is poised to deploy U.S. military forces to respond to a future geopolitical crisis. Suddenly an authoritarian state covertly targets the operations of Federal Civilian Executive Branch (FCEB) agencies with disruptive cyber threats. The attack holds a few missions and essential services as digital hostages and signals the potential to do even worse in an escalating crisis: It’s a bid to panic U.S. leaders and the American public and deter the nation from acting in the interest of national security. Now the president’s decisions on the crisis are harder to make due to the vulnerability of data, devices, and systems at civil government agencies. This potential scenario illustrates the urgency of strengthening federal cybersecurity today.

To get ahead of such threats, the Biden administration is implementing zero trust across the federal enterprise. In this whole-of-government effort, roles can grow over time: Zero trust isn’t a zero-sum game. Now the nation needs the Cybersecurity and Infrastructure Security Agency (CISA) to assume a more visible, practical role helping civilian government agencies with zero trust architecture (ZTA) implementation. Enhancing CISA’s zero trust role this way is one of the recommendations to CISA and Congress in a new independent report published by the Center for Strategic and International Studies (CSIS). The study, which Booz Allen sponsored, serves the public interest: It reviews the current cyber services offered to the FCEB agencies as well as the current and future state of the threat landscape. It also recommends other services that CISA could offer FCEBs for stronger protection.

Civilian agencies have a diverse range of missions, separate budget plans, and unique IT modernization efforts, but they share a requirement to meet specific zero trust goals by the end of fiscal year 2024. CISA has made significant contributions to this effort, including the release this year of an updated Zero Trust Maturity Model. Also, CISA is in the early stages of developing a related technical annex for operational technology (OT). In addition, CISA is exploring the development of new zero trust metrics and measures to augment existing Federal Information Security Modernization Act (FISMA) metrics and assessing how its Continuous Diagnostics and Mitigation (CDM) program could enable automated reporting.

Addressing key challenges

With further tasking and resources, CISA could supply more help to address three major challenges that impede FCEB ZTA implementation:

  1. Agencies need to assess the current state of their zero trust maturity. Right now, most FCEB agencies have given CISA rudimentary zero trust assessments that aren’t well structured and evoke “check the box” compliance.
  2. Agencies need to implement zero trust. CISA has issued several pieces of guidance: These do not dictate a single approach—and that’s fine. CISA should revise its guidance on CDM capability requirements to reflect orchestration and automation objectives, such as conditional access. It should also share those requirements with industry so that original equipment manufacturers (OEMs) can demonstrate how their products enable those requirements.
  3. Agencies need to carry out continuous monitoring and reporting. All 93 agencies with a CDM Memorandum of Agreement (MOA) have deployed the CDM Dashboard and are feeding data to CISA. However, there is still further work to do to expand monitoring to more aspects of the enterprise.

Enhancing CISA’s role

So, what would CISA’s enhanced role look like? For starters, here are some ideas:

  • CISA could have a team of zero trust experts engaged with FCEB agencies to supply recommendations on architecture and implementation approaches.
  • What’s more, CISA could work with the Department of Defense (DOD) to see how they are implementing zero trust via the Thunderdome effort. It could also schedule technology exchanges that complement CISA’s ongoing high-level engagement with DOD’s chief information officer (CIO).
  • CISA could expand on nascent efforts to develop specific metrics and measures for zero trust that could be reported in an automated fashion using the CDM Dashboard Ecosystem.

The ZTA recommendation is just one of many pieces of actionable advice in the CSIS report. Another recommendation urges Congress to ensure consistent, coherent, and flexible funding streams for initiatives like the CDM program. CDM helps civilian agencies strengthen their management of assets, user access controls, network security, and data protection, and it enables CISA to respond to cyber threats in a coordinated, accelerated way. Also, the report calls for a study of whether to (and how to) centralize ownership of FCEB networks: By addressing key issues and questions like these, the nation can ensure the federal government is well positioned to build cybersecurity and resilience at scale.

Learn more about Booz Allen’s mission-forward solutions and services at www.BoozAllen.com/Cyber.

NARA to remove analog records as part of new digitization standards
https://federalnewsnetwork.com/federal-insights/2024/06/nara-to-remove-analog-records-as-part-of-new-digitization-standards/
Tue, 11 Jun 2024 20:56:55 +0000

Federal Insights - Records Management - 06/11/2024

The National Archives and Records Administration is moving away from analog records and now requires agencies to transfer records to it in digital format. NARA’s digitization standard takes effect at the end of June 2024.

“The deadline is June 30. In little more than six weeks, there’s going to be a major shift in how NARA accessions records from agencies. Arguably valuable permanent records that are part of our nation’s treasures. We got the biggest set of records covered first. Those standards are very detailed. They are almost like a checklist. And they explained to agencies and the vendors supporting agencies what needs to happen to create that digital image, that version of that permanent record that is coming to NARA. We are not accessioning the paper and the digital image, we are only going to be bringing in the digital image,” said Lisa Haralampus, the director of Federal Records Management Policy and Outreach at NARA, on Federal Insights — Records Management.

NARA recently opened a new digitization center in College Park, Maryland to evolve and provide better access to federal government records and expand its capacity.

“For the last year and a half or so, there was a renovation effort in our archives building at College Park. And we’ve renovated 18,000 square feet and established a modern state-of-the-art digitization center. A mixed use space that colocates our work processes. So archival prep, preparation of records before digitization, metadata capture and then ultimately scanning. We brought the different functions together in one location. We have a fleet of top of the line imaging equipment that ranges from overhead camera setups, flatbed scanners, microfilm and microfiche theatres. And we purchased three new ibml FUSiON 8300 high-speed [scanners] that will exponentially make more records available online,” said Denise Henderson, director of digitization for the Office of Research Services at NARA.

NARA’s digitization is a multi-part process, with different records requiring different techniques to scan and digitize. Under the digitization standards for NARA’s archives, agencies must create digital copies of all permanent paper records and printed photos.

“We have format standards that we use at the National Archives; their records have been created in so many different formats over time by so many agencies depending on what they’re doing. We will take PST files, we will take EML files, we will take XML files, but we won’t see Lotus Notes on that email list. We need the email to be sent to us in a format that we can maintain. Unless your federal mission is really unique, and you are the standards authority, we try to base our standards on what’s common practice. So when we were developing the digitization standards for permanent records, we went and looked, well, what would we base them on? We at the National Archives, our job is to preserve our nation’s history,” Haralampus said.

NARA also requires digitized permanent records to meet Federal Agencies Digital Guidelines Initiative (FADGI) standards in order to be added to the archives. FADGI guidelines set standards for federal agencies to follow as best practices when processing digital historical, archival and cultural content, including maps, documents and prints.

“We are a cultural heritage institution. So we are using the Federal Agency Digitization Guidelines, because those were standards that were created to handle cultural heritage materials. The FADGI standards gives us our basis for the technical component of scanning, including things like what is the allowable error for noise. How do you test and make sure that you’ve got a calibrated workstation, so you know your image is what you produce,” Haralampus told the Federal Drive with Tom Temin. “When we wrote these digitization standards, we had the idea of modern textual records in mind; that’s where we started. Eventually, the FADGI standard that we produced would cover any type of record whether it was onionskin from the 1940s or maps. So our standards cover all types of records.”

Modern Textual Records (MTRs) refers to documents produced on modern office paper. If the records predate 1950, or if a specific MTR has value, NARA will accept the original along with the digital record.

“We created a disposition authority structure that has a check in it. An opportunity for NARA and for the agency and actually members of the public as well to weigh in and say yes, those records, we want to take the source record as well as the digitized record. So for us, modern textual records, we’re not anticipating those as having intrinsic value and coming to the National Archives,” Haralampus said.

When it comes to optical character recognition (OCR), NARA is not requiring it as a standard for agencies to perform. Haralampus said that will be a standard to look at in the near future, but as of now NARA can’t find an OCR standard equivalent to the digitization standard.

“Most agencies are not digitizing records just to send them to the National Archives. They’re digitizing records because they need them to perform their mission. And as they’re performing their mission, the output of that is you should digitize to our permanent record standards. Don’t waste the digitization effort happening across the government,” she said.

When it comes to AI at Energy, it takes a village
https://federalnewsnetwork.com/federal-insights/2024/06/when-it-comes-to-ai-at-energy-it-takes-a-village/
Mon, 10 Jun 2024 14:54:18 +0000
Rob King, the chief data officer at the Energy Department, said a new data strategy and implementation plan will set the tone for using AI in the future.


Federal chief data officers are playing a larger role in how their organizations are adopting and using basic or advanced artificial intelligence (AI).

A recent survey of federal chief data officers by the Data Foundation found that more than half of responding CDOs say their role around AI has changed significantly over the past year, compared to 2022, when 45% said they had no AI responsibility.

Taking this a step further, with nearly every agency naming a chief AI officer over the past year, the coordination and collaboration between the CDO and these new leaders has emerged as a key factor in the success of any agency AI program.

“We are taking a collaborative and integrated approach to aligning data into artificial intelligence and building synergies between the role of data and data governance, and really being able to meet the spirit of the requirements of the AI executive order, with the ability to interrogate our data ethically and without bias as they are being imported into artificial intelligence models,” said Rob King, the chief data officer at the Energy Department, on the discussion Government Modernization Unleashed: AI Essentials. “We’re really now trying to ensure that we can bake in the appropriate governance management, make sure we have oversight of our AI inventories and start to align the right controls in place from a metadata management and from a training data standpoint, so that we can meet both the letter and the spirit of the AI executive order. We don’t just want to be compliance driven, but ensure that we are doing the right thing to leverage those AI models to their full extent, and make sure that we can accelerate the adoption of them more broadly.”

For the adoption King describes to happen more broadly and more quickly, data must be prepared, managed and curated to ensure that AI, or really any technology tool, works well.

CDOs in a unique position

He said AI is just the latest accelerator that has come along that reemphasizes the importance of understanding and protecting an organization’s data.

“How do we use AI to help us look for themes, patterns of usages in our data to advance the classification and tagging of our data from a stewardship standpoint, so that we can understand that whole full cycle? We’re calling things like data-centric AI to ensure that we’re looking at ways to use non-invasive data governance approaches to help meet the mission needs of AI. It’s a great feedback loop,” King said. “We’re using AI to drive the maturity of our processes so that we can advance the mission adoption of AI as well. The CDOs are in a unique position because we live by the tenets of ‘it takes a village.’ It takes us working with policy and process leaders, and now the chief AI officers (CAIOs) and mission stakeholders, bringing us all together to really drive the outcomes of strong data management practices, now aligned to positioning for AI adoption.”

King, who has been the CDO at Energy for almost a year, said policies like the Federal Data Strategy or the Evidence-Based Policymaking Act have created a solid foundation, but the hard work that still must happen will be by CDOs and CAIOs as they put those concepts into action.

One way King started down this data management journey was by developing an enterprise data strategy and “recharging” DoE’s data governance board, ensuring all the right stakeholders with the right subject matter expertise and relevance are participating.

“We’re on the precipice of completing that strategy. It’s been published in a draft format to our entire data governance board members for final review and edit. We hope to bring that to the finish line in the next few weeks,” he said. “From there, we’re already moving right into a five-year implementation plan, breaking it down by annual increments to promote that strategy, recognizing that our science complex, our weapons complex and our environmental complexes have very different needs.”

Testing AI has begun

The new data strategy will lay out what King called the “North Star” goals for DoE around data management and governance.

He said the strategy details five goals, each with several objectives and related actions.

“We wanted to make sure that everyone could see themselves in the strategy. The implementation plan is going to be much more nuanced. We’re now taking key stakeholders from our data governance group and building a team with appropriate subject matter experts and mission representatives to build out that implementation plan and to account for those major data types,” he said. “The other thing we’re starting to look at in our strategy is [asking] what is the right ontology for data sharing? We should have a conceptual mission architecture that can show where we can accelerate our missions, be it on the weapons side or on the science and research side. Where can we build ontologies that say we can accelerate the mission? Because we’re seeing like functions and like activities that, because of our federated nature at the Department of Energy, we can break down those silos, show where there’s that shared equity. That could be some natural data sharing agreements that we could facilitate and accelerate mission functions or science.”

Even as Energy finalizes its data strategy, its bureaus and labs aren’t waiting to begin testing and piloting AI tools. Energy has several potential and real use cases for AI already under consideration or in the works. King said applying AI to mission critical priorities like moving to a zero trust architecture and in the cyber domain is one example. Another is applying AI to hazards analysis through DoE’s national labs.

King said the CDO and CAIO are identifying leaders and then sharing how they are applying AI to other mission areas.

“I’m trying to partner with them to understand how I can scale and emulate their goodness, both from pure data management standpoint as well as artificial intelligence,” he said. “We have one that the National Nuclear Security Administration is leading, called Project Alexandra, around non-nuclear proliferation. They’re doing a lot of great things. So how do we take that and scale it for its goodness? We are seeing some strategic use cases that are of high importance. The AI executive order says our foundational models need to be published to other government agencies, academia and industry for interrogation. So how do we then start to, with the chief AI officer, say what is our risk assessment? And what is our data quality assessment for being able to publish our foundational models to those stakeholders for that interrogation? How do we start to align our data governance strategy and use cases to some of our AI drivers?”

The post When it comes to AI at Energy, it takes a village first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/when-it-comes-to-ai-at-energy-it-takes-a-village/feed/ 0
The team effort that led to the Marines’ clean audit triumph https://federalnewsnetwork.com/federal-insights/2024/06/the-team-effort-that-led-to-the-marines-clean-audit-triumph/ https://federalnewsnetwork.com/federal-insights/2024/06/the-team-effort-that-led-to-the-marines-clean-audit-triumph/#respond Mon, 10 Jun 2024 14:37:38 +0000 https://federalnewsnetwork.com/?p=5024891 By achieving a clean financial audit for the first time ever, the Marine Corps can provide accountability, transparency and validity for their spending.

The post The team effort that led to the Marines’ clean audit triumph first appeared on Federal News Network.

]]>

The Marine Corps celebrated a much-sought-after milestone in February: obtaining an unmodified audit opinion for fiscal 2023.

This two-year effort proved that the corps’ 2023 financial statements “present a true and fair reflection of the Marine Corps’ financial information,” covering about $46 billion in total assets.

While auditors say there are still seven areas where the Marines need to improve, Greg Koval, the assistant deputy commandant for resources for the Marine Corps, said this historic feat means, for maybe the first time ever, they can provide accountability, transparency and validity for their spending.

“It gives us transparencies into the cost of production, and in the future, it means the tracking of the cost of maintenance for many of our weapons system platforms,” Koval said on the discussion Marine Corps Milestone: Unqualified Audit Insight. “What it does over time is allow us to really plan, program, budget, execute better and identify those programs, where maybe they cost a little bit more, a little bit less, get those funds to the right place more timely so that we’re better able to execute and give the warfighter what they need to execute the mission. Ultimately on the financial side, we’re here to support them, help them and give them everything they need. So when they deploy, they’ve got the best solution, the best weapons systems they can have at that point in time.”

Like most of the Defense Department, the Marine Corps has been under pressure from Capitol Hill for decades to achieve a clean audit and has been putting more significant resources and focus on the challenges since 2017.

Marines’ new general ledger system

The Marine Corps had come close to a clean audit before the 2023 opinion. In 2012, for example, the service thought it had achieved a “favorable opinion,” only for the DoD inspector general to reverse that decision in 2015.

DoD, as a whole, is targeting fiscal 2028 to achieve a clean financial opinion.

The Marines’ success demonstrates that it is possible for even the largest organizations in DoD to align their data, systems and processes to achieve this goal.

“At the beginning of this journey, we moved to a new general ledger. We had what we called SABRS, which was known and loved across the Marine Corps for over three decades. We took everybody off of that accounting system and moved into this new modern enterprise resources planning (ERP) system, which had its set of challenges. We basically adopted a system that smaller DoD components used, and some of our business processes were new to the system, new to the process, so there was a huge learning curve there,” Koval said. “That learning curve didn’t just impact the financial folks, but they impacted supply and procurement. There were times where we were working hard to pay vendors on time because the system wasn’t working as our old system did. But I think it really brought some additional discipline and internal controls to the financial processes that ultimately helped us understand some of our procurement and logistic processes a little bit better. It really kind of opened our aperture on some of the costs that we were incurring, who we were paying, and it gave us that additional transparency and visibility into the data.”

That major shift in the way the Corps did business, Koval said, really kicked the entire effort into gear by providing the financial team with the agility needed to understand and improve its data.

For any agency or large organization, the big data challenge can be daunting, said Joe Nave, principal federal finance transformation lead for KPMG, which helped the Marines achieve the clean audit opinion.

Analyze and assess risks

“You had to sift through the business processes across the board for geographically dispersed organizations such as the Marine Corps. You look at all of the integration across the rest of the DoD and the different partners outside of the spectrum that the Marine Corps has operational control over, and you start to look at how complex and complicated those processes can really be. From our perspective, it was helping them analyze, assess risk, boil down a couple of them and get the activities that we really needed to accomplish down to a finite list, where we could really focus our efforts and help them move some of these big rocks associated with the material weaknesses and audit deficiencies,” Nave said. “I think over time, you look at the way that the workforce is structured and having to do 100% of the day job, and then you add in some of these audit priorities and you add in some of the samples, we’ve really had to look at ways to modernize and automate those processes to help facilitate quicker reviews, quick requests, quality control ease, and make sure that we’re set up for success and able to respond efficiently and effectively to the audit.”

Nave said moving to the new ERP accounting system played a significant role in helping the Corps adapt processes and procedures as the needs change during the modernization process.

“I like dashboarding as a way to make sure that our clients have the insight that they need to see in real-time where progress is being made, and where progress is being made against those discrete buckets [of goals],” he said. “Then usually, we like tiger teams to assess progress against that. These small, mobile, tactical units, if you will, are going out and solving these problems with brute force, and then focusing on the sustainment of that. That really gets us to our end goal of an unmodified opinion and being able to continue that unmodified opinion, year in and year out, layering in that automation and modernization to those tiger team efforts.”

Auditors say the Marines still have seven material weaknesses to resolve.

Koval said a lot of those were on the property side and the need to better integrate data from disparate systems.

“What the audit did for us was really bring those organizations closer together. It broke down a lot of the walls and communications in the way that we work with each other,” he said. “Now, supply, logistics, procurement and accounting all have a better understanding of what we do, how we impact each other and what needs to change to make the organization more efficient, effective and to save costs, frankly, going forward.”

Going forward, among the Marines’ goals are to continue to build upon the previous two-year effort to further integrate processes and systems to make them a more efficient organization.

Nave said the Marine Corps is now set up for long-term sustainment and overall audit response because of the process and procedural changes it has made.

The key lessons from the Marines’ experience that other military services and organizations can heed, Nave said, are adaptability, being comfortable with the plan, and understanding that plans will change over time.

“We really want systems working for us, not against us. We want to make sure that our IT environment is squared away. We want to make sure that all of the interfaces or feeder systems that we have are clearly laid out. And we’ve looked at the complexity of those different processes and made sure that those all make sense,” he said. “So it is really just a rationalization of your portfolio and trying to make the sandbox smaller. First make sure everything’s in the sandbox, and then what can you do to make it smaller? Then I think leadership must set the tone from the top, cascading that information down and emphasizing the importance. Whether it’s an audit or any other objective you’re trying to accomplish, having that buy-in and tone from the top has been critical.”

The post The team effort that led to the Marines’ clean audit triumph first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/the-team-effort-that-led-to-the-marines-clean-audit-triumph/feed/ 0
How generative AI is cutting down ‘busy work’ and speeding up processing to combat FWA https://federalnewsnetwork.com/federal-insights/2024/06/how-generative-ai-is-cutting-down-busy-work-and-speeding-up-processing-to-combat-fwa/ https://federalnewsnetwork.com/federal-insights/2024/06/how-generative-ai-is-cutting-down-busy-work-and-speeding-up-processing-to-combat-fwa/#respond Thu, 06 Jun 2024 03:47:59 +0000 https://federalnewsnetwork.com/?p=5029652 Optum Serve’s Amanda Warfield tells Federal News Network how agencies are tapping into generative AI to make federal employees even more productive.

The post How generative AI is cutting down ‘busy work’ and speeding up processing to combat FWA first appeared on Federal News Network.

]]>

Leaders across the federal government are seeing generative artificial intelligence and large language models (LLMs) as promising tools that will reshape how agencies deliver on their mission.

The Biden administration is calling on agencies to experiment with GenAI, and is touting the transformative role this emerging technology will have on government work.

“As generative AI products become widely available and common in online platforms, agencies are discouraged from imposing broad general bans or blocks on agency use of generative AI,” President Joe Biden wrote in a sweeping executive order issued in October 2023.

The executive order underscores agencies’ caution over GenAI, but also signals the importance of experimenting with this emerging technology.

Amanda Warfield, the vice president of program integrity at Optum Serve, said agencies see GenAI as a tool that will enable federal employees to become more productive.

“In the last year or so, we’ve really seen an explosion in generative AI,” Warfield said. “And the federal government has really been trying to apply the right set of guidelines around it, and figure out where it is the most valuable, where it can create the most efficiencies.”

Warfield said agencies see generative AI as a promising tool to eliminate or reduce manual tasks, while empowering the federal workforce to focus on higher-impact tasks.

“Generative AI is just there to supplement and make those tasks that most people probably don’t like doing — busy work — whether it’s either data entry, or manual review of large documents. [It’s] things that take you a lot of time. What if you had a way to streamline that, to automatically have a tool that’s going to identify the right section of a 1,000-page document for you?” Warfield said. “For employees, they can then spend their time doing more of what their specialized skill is.”

GenAI as ‘policy assistant’

Agencies are identifying GenAI use cases across a wide array of mission areas. Warfield said many agencies see opportunities to use it to provide a better customer experience to the public.

“For a given agency, how can that be applied to help streamline things, make their lives easier when they’re applying for program benefits, or things like that, that really add value and are meaningful to agencies’ missions?” she said. “It’s about being efficient, saving time and money, and then being able to really prioritize the workload that gives you the most value and the most [return-on-investment].”

Warfield said agency watchdogs, including inspector general offices, are also turning to GenAI as a “policy assistant” to tackle a growing workload of fraud cases.

“They have more cases than they can work. They have finite resources. They don’t have agents to work and prosecute every case that comes their way. So imagine being able to apply generative AI to streamline what today is very manual,” she said.

As IG offices evolve their toolkits to stay ahead of fraudsters, Warfield said GenAI helps investigators comb through hundreds — if not thousands — of documents, flag anomalies and build evidence in a potential fraud case.

“If we’re talking about a provider in health care, it’s looking at eons of claims data and comparing that to policy documentation,  federal regulations and guidelines to essentially prove what the provider did, or what they billed, violated policy — and how can they prove that’s intentional,” Warfield said. “It involves a lot of manual research, combing through data, combing through these large documents, and to empower agents with a tool that that can easily condense down massive amounts of PDF files and documents and all sorts of data into a human-like Q&A format … [on] whatever case they’re prosecuting … it can provide an easy way for anybody who has health care experience or doesn’t to be able to interpret those big documents.”

GenAI can also supplement the skillsets of employees — allowing them, for example, to write code or parse large volumes of data, even if they don’t have a technical background.

“A lot of folks who support fraud, waste and abuse on the downstream side, in looking at cases for potential prosecution or other action, not all of them are technical individuals who know how to query data or write SQL queries or program. But they still have a need to access data, to aggregate data, to look at trends over time. And using generative AI in a way that allows a regular person to just go in and say, ‘Hey, can you tell me how many claims over the last year have been paid using this type of a procedure code?’ And then have that data automatically aggregated for you, or have the query written for you so that you can just go drop it in somewhere, or even produce charts and visualizations for you, that show you that data in a meaningful way that really gives you the insights right off the bat. Those are huge time savers, for individuals who typically would have to refer that to someone else, wait days or weeks to get the data back, it can really speed up that process.”
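The workflow Warfield describes, a plain-English question turned into an aggregation query, can be sketched in a few lines. The schema, procedure code, dates and amounts below are hypothetical, invented purely for illustration; no agency system or real claims data is represented:

```python
import sqlite3

# Hypothetical claims table; the schema and every value are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER, procedure_code TEXT, "
    "paid_date TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?, ?)",
    [(1, "99213", "2024-01-15", 120.0),
     (2, "99213", "2023-03-02", 95.0),   # outside the one-year window
     (3, "99214", "2024-02-20", 180.0),
     (4, "99213", "2024-04-05", 110.0)],
)

# The kind of query a GenAI assistant might draft from the plain-English
# question "how many claims in the last year were paid using this
# procedure code?" -- the analyst never writes SQL themselves.
query = """
SELECT COUNT(*) AS claim_count, SUM(amount) AS total_paid
FROM claims
WHERE procedure_code = ?
  AND paid_date >= ?
"""
count, total = conn.execute(query, ("99213", "2024-01-01")).fetchone()
print(count, total)  # 2 claims totaling 230.0 in the window
```

The point is not the SQL itself but the handoff: the model drafts the query, and the analyst reviews the aggregated result instead of waiting days for a data team.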

Warfield said IG shops can also use GenAI to ingest agency-specific user guides and standard operating procedures, so that newer employees can pull up reference materials faster than ever.

“Instead of you having to sit in a six-hour-long training and try to remember where the section was that was relevant to you, you can then use your Generative AI assistant to say ‘Remind me what our SOP is for whatever the process is,’ and be able to pull up that section really quickly — or just have it summarized for you in a nice, easy-to-read response,” she said.
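The SOP lookup Warfield describes can be approximated even without a large language model; a rough sketch using simple keyword overlap over invented section text (all section titles and content below are hypothetical, not from any real SOP):

```python
# Hypothetical SOP sections an assistant might search over.
sop_sections = {
    "Case intake": "Log the referral, verify the source, assign a case number.",
    "Evidence handling": "Preserve originals, record chain of custody, store securely.",
    "Claims review": "Pull claims history, compare billed codes against policy limits.",
}

def lookup_sop(question: str) -> str:
    """Return the SOP section whose title and text best overlap the question."""
    words = set(question.lower().split())

    def score(item):
        title, text = item
        return len(words & set((title + " " + text).lower().split()))

    title, text = max(sop_sections.items(), key=score)
    return f"{title}: {text}"

answer = lookup_sop("what is the process for chain of custody of evidence")
print(answer)
```

A production assistant would use embeddings and a language model to summarize the match, but the shape is the same: ask in plain English, get the relevant section back instead of rewatching a six-hour training.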

Getting started with GenAI

Agencies see limitless potential — but also plenty of new risks — when it comes to incorporating GenAI into their day-to-day work.

Among the challenges, agencies need to understand the scope of what the algorithms they’re using have been trained to do, and ensure they don’t produce biased results.

“You can’t just go out and take ChatGPT and apply it to magically work for the HHS mission or in Medicare processes. You have to really take an approach that factors in agency-specific data, agency-specific expertise and context,” Warfield said.

Another challenge agencies face is understanding what datasets to train a GenAI algorithm on, and how to set clear boundaries on which data the algorithm can use.

“There has to be a way to ensure that data is always accurate, it’s always current. It’s the latest version that you’re accessing, so that when you actually apply it into your business processes, you’re getting the right answers and the right accuracy,” Warfield said.

Agencies are also thinking about the role GenAI plays in cybersecurity. Warfield said agencies need to adopt a zero-trust mindset when it comes to fielding AI tools.

“You’re thinking about how the data is going to come in to your federal enclave. How are you going to ensure that the data never leaves your security boundary? What checks and balances do you have, that you can apply upfront, and make sure those are part of your selection criteria, that decisions are being made to factor those in? Those types of things are really important from a security perspective.”

GenAI best practices

While agencies have much to consider for adopting GenAI tools, Warfield outlined a few best practices to keep in mind.

Agencies, she said, should consult with experts before deploying any generative AI tools.

“Having a way to select the right large language model for the right use case is really important. It’s not a one-size-fits-all approach. It’s really important to make sure agencies are consulting with the right experts upfront to have that selection criteria defined to make sure those decisions are made in a way that’s really effective,” she said.

Agencies also need to ensure that human employees still maintain decision-making authority, while using GenAI as means of making data-driven decisions faster than ever.

“You still need to make sure there’s a human in the loop, and you’re not just taking whatever the response is by itself,” Warfield said. “That human in the loop oversight is really important to monitoring the results of your generative AI’s answers: making sure they’re continuing to stay accurate, the training or retraining of the models that needs to happen to stay current and refreshed. All those processes have to be built into your overall framework.”
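One common way to implement the human-in-the-loop oversight Warfield describes is a confidence gate: model findings below a threshold are queued for a reviewer rather than accepted automatically. The threshold, field names and records below are illustrative assumptions, not values from any agency framework:

```python
# Findings below this confidence are routed to a human reviewer.
# The threshold is an illustrative assumption.
REVIEW_THRESHOLD = 0.85

def route(finding: dict) -> str:
    """Accept high-confidence AI findings; queue the rest for human review."""
    if finding["confidence"] >= REVIEW_THRESHOLD:
        return "auto-accept"
    return "human-review"

findings = [
    {"case": "A-101", "confidence": 0.97},
    {"case": "A-102", "confidence": 0.62},
]
decisions = {f["case"]: route(f) for f in findings}
print(decisions)  # {'A-101': 'auto-accept', 'A-102': 'human-review'}
```

Even the auto-accepted findings would be sampled and monitored over time, per Warfield's point that model accuracy has to be re-checked as data and policies change.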


The post How generative AI is cutting down ‘busy work’ and speeding up processing to combat FWA first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/how-generative-ai-is-cutting-down-busy-work-and-speeding-up-processing-to-combat-fwa/feed/ 0
CDM evolution: Ensuring vigilance across a hybrid landscape https://federalnewsnetwork.com/cme-event/federal-insights/cdm-evolution-ensuring-vigilance-across-a-hybrid-landscape/ Wed, 05 Jun 2024 18:03:05 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=5028816 CDM’s ever-evolving role in civilian cyber

The post CDM evolution: Ensuring vigilance across a hybrid landscape first appeared on Federal News Network.

]]>
The Cybersecurity and Infrastructure Security Agency (CISA) continues to expand Continuous Diagnostics and Mitigation Program capabilities to become the tool of first resort for both proactive risk management and incident response and coordination.

We share insights from CISA as well as from NASA, the National Capital Planning Commission, Small Business Administration and Booz Allen Hamilton.

Download our exclusive ebook now!

The post CDM evolution: Ensuring vigilance across a hybrid landscape first appeared on Federal News Network.

]]>
What’s in your cyber supply chain? https://federalnewsnetwork.com/cme-event/federal-insights/whats-in-your-cyber-supply-chain/ Wed, 05 Jun 2024 17:07:30 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=5028783 Need help on your SCRM journey? We’ve gathered advice from federal and industry experts

The post What’s in your cyber supply chain? first appeared on Federal News Network.

]]>
Federal News Network has gathered advice and tactics about wrangling risk in your supply chains from CISA, DHS’ Silicon Valley Innovation Program, NIST and leading industry experts.

You will hear from:

  • Laurie Locascio, director, National Institute of Standards and Technology
  • Cherilyn Pascoe, director, NIST’s National Cybersecurity Center of Excellence
  • Melissa Oh, managing director, DHS’ Silicon Valley Innovation Program
  • Anil John, technical director, SVIP
  • Eric Goldstein, executive assistant director for cybersecurity, CISA
  • Justin Orcutt, security specialist, aerospace and commercial defense team, Microsoft
  • Chad Sheridan, chief innovation officer, NetImpact Strategies
  • Nick Mistry, senior vice president and chief information security officer, Lineaje

Download our new ebook today!

The post What’s in your cyber supply chain? first appeared on Federal News Network.

]]>
How to manage the digital records deadline https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/ https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/#respond Thu, 30 May 2024 21:06:25 +0000 https://federalnewsnetwork.com/?p=5021262 The June 30th deadline is approaching, after which NARA will only accept digitized documents. Agencies must deal with the largest volume category, known as modern textual records.

The post How to manage the digital records deadline first appeared on Federal News Network.

]]>

Ever since people first applied ink to parchment, preserving records has posed challenges. Now federal agencies face a June 30th deadline to digitize certain federal records. The National Archives and Records Administration will require agencies to submit the digitized versions, including metadata for future accessibility. Agencies are also obligated to conform to NARA standards in carrying out the digitization.

Long in the making and several times delayed, the digital requirement ultimately stems from the never-ending growth in the annual production of paper records and the resulting storage volume.

“There’s hundreds of millions of dollars being spent every year by federal agencies to create, manage and store these hardcopy records,” said Anthony Massey, strategic business developer at Canon, on Federal Insights — Records Management. The digitization directive, Massey said, is designed to make archiving easier and less costly while making records themselves more accessible.

The various types of documents – maps, photographs, items deemed culturally significant and 8.5 x 11-inch bureaucratic output – each have their own associated standards and require different technologies to digitize, Massey noted. NARA’s standards-making has been informed by guidelines from the Federal Agencies Digital Guidelines Initiative, or FADGI.

The initiative got underway about 10 years ago “as a concept of how to begin to guide agencies into what kind of a digitization format they could then roadmap their policy and procedure to,” Massey said.

Many digitizing procedures incorporate scanning. Scanning itself has continually advanced, said Tae Chong, Canon’s manager of new business development. One development especially relates to a type of document known as a modern textual record (MTR).

An MTR typically was created electronically, perhaps with a modern word processing program or – as is often the case with older records about to leave agency possession and move to NARA – in a program whose technical format is no longer supported.

That means digitizing a paper printout by scanning it. Now, Chong said, scanning technology includes “software engineering techniques to tell the text from the background and … special software image processing to essentially enhance the visibility of the text element, while erasing unwanted graphics on the background.”

A second element in state-of-the-art scanning, Chong said, encompasses optical character recognition that “can kick in to pick up the text information and pass it to a software application which will then index the document for later search and retrieval.”

He noted that agencies must also by law preserve a paper copy. But by extracting the information and indexing it, public retrieval and viewing will no longer require handling the paper itself.
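The extract-and-index step Chong describes can be sketched in a few lines of Python. In this sketch the OCR engine is assumed to have already run, so each record is represented by its extracted plain text (the record IDs and sample texts below are hypothetical); the code builds a simple inverted index that supports the later search and retrieval he mentions:

```python
from collections import defaultdict

def build_index(documents):
    """Build an inverted index: term -> set of document IDs.

    `documents` maps a document ID to the OCR-extracted text of
    that record (the OCR step itself is assumed to have run already).
    """
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term.strip(".,;:")].add(doc_id)
    return index

def search(index, term):
    """Return the sorted IDs of documents containing the term."""
    return sorted(index.get(term.lower(), set()))

# Hypothetical scanned records, post-OCR
docs = {
    "rec-001": "Memorandum on records retention schedules",
    "rec-002": "Annual report on firearms tracing statistics",
    "rec-003": "Retention policy for modern textual records",
}
index = build_index(docs)
print(search(index, "retention"))  # -> ['rec-001', 'rec-003']
```

A production pipeline would add stemming, stop-word removal and confidence scores from the OCR engine, but the principle is the same: once the text is indexed, a researcher can find a record without ever handling the paper.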

“This new regulatory requirement is focusing on creating a digital replica of the paper originals,” Chong said.

Special breed

MTRs differ from cultural heritage documents. In the latter type, the entire area of the document encompasses information to preserve; for example, pieces of artwork or hand-lettered manuscripts. OCR technology won’t yield much information, and the background requires preservation along with whatever else the document exhibits.

“When NARA and the working group of FADGI began to establish classifications of imaging for digitizing these various types of records,” Massey said, “they discovered in that particular context of the printed record, there was a need to get a special type of digitization process called MTR that was simpler, less involved with much less expensive equipment that could do a very high quality image and make it transportable into an archive.”

Because MTRs exist nearly universally as printouts on standard office paper, agencies can apply high-speed scanning techniques to them. Massey said agencies have produced billions of MTRs, printing them out as either temporary or permanent records.

For such documents, Massey said, NARA wants an online catalog. A researcher with a particular topic “can go to a Library of Congress online catalog and look up that document, instead of having to go in person to a particular storage site or physically go and handle that document.”

While MTR is a process or image standard and not a hardware standard, Massey said Canon has developed scanners specifically for MTR.

“The hardware must then be aligned to those scanning requirements,” he said.

For practical purposes, speed is an important requirement for MTR scanners. Massey said the faster the process occurs, the faster agencies can clear backfile projects for older records. For new records, he said agencies should consider establishing in-house capability to scan and index records as they create them.

“When records management officers look at day-forward scanning,” Massey said, “knowing that from that day forward they also have to digitize these records, they want access to equipment that can do that at a setting that is confidently MTR capable.”

Listen to the full show:

The post How to manage the digital records deadline first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/feed/ 0
ATF begins looking to new cyber strategies as it nears 100% cloud migration https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/ https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/#respond Tue, 28 May 2024 18:08:48 +0000 https://federalnewsnetwork.com/?p=5017971 Containerization and automation are two of the tools ATF is looking to use to implement zero trust principles as it re-architects its systems.

The post ATF begins looking to new cyber strategies as it nears 100% cloud migration first appeared on Federal News Network.

]]>
Federal Insights — Best Practices in Secure Software Development — 5/28/24

The Bureau of Alcohol, Tobacco, Firearms and Explosives is only a few months away from having 100% of its systems in the cloud. That’s the culmination of almost eight years of effort, said Mason McDaniel, ATF’s chief technology officer. He said that’s been such a large lift because there are no commercial, off-the-shelf products for missions like criminal investigations, firearms dealer regulations or firearm tracing. And because those systems weren’t compatible with the cloud, ATF needed an environment that allowed them to be rebuilt from the ground up.

“We really refocused on building an enterprise, continuous integration, continuous delivery (CI/CD) environment, rebuilding all of our processes around automation, and really focused on building this pipeline that let us rebuild our applications quickly, efficiently, deploy things quickly, and then we use that as the enabler to go through application by application and try to get those rebuilt. And we are just about at the end of that journey,” McDaniel said on Federal Insights — Best Practices in Secure Software Development.

One key part that McDaniel said ATF prioritized was not changing the business processes, in order to minimize retraining. Instead, ATF focused on wrapping modern frameworks and automation technologies around those, to set the stage for modernizing those business processes as rapidly as possible in the future.

Automating cybersecurity

That also gave ATF the opportunity to embed automated cybersecurity processes throughout the development lifecycle, said ATF Chief Information Security Officer Hillary Carney. That includes penetration testing, endpoint detection and response tools, security information and event management logging tools, and more. That gives developers the feedback they need to address vulnerabilities from test cases through production, as well as lifetime visibility.

“One of the things that I think cloud really helped us with is that near-real time visibility; it allows us to be so much more agile, not only for meeting the business mission need, but for the security testing portion as well,” Carney said. “And being able to interact with the operations teams and say ‘we monitor on a daily basis through our tools. And we’re seeing this change; the posture has changed, and we need you to get in there, and diagnose why that’s happening.’ So cloud has been essential in order to move our program forward, to be a lot more responsive to both mission and then to cybersecurity.”

“But just like the tools have gotten better, so have the adversaries. That’s really what’s driving this. It’s an arms race. So if we are not on top of it, someone else will find it. They will exploit it,” she added. “I am over the moon with the progress we’ve made and being able to do more near real-time analysis, do more agile testing. However, as we get better, they get better. So there is no rest for the weary.”

That’s why the next thing on ATF’s cybersecurity to-do list is to begin using the Cybersecurity and Infrastructure Security Agency’s software attestation form. Eventually, Carney said, the goal is to get to using Software Bills of Materials, but that’s too much of a culture change all at once. She said, much like ATF has done with its CI/CD program, the intent is to start slow and build the case as they build the program.

Containerization

But in the meantime, ATF is leveraging its new CI/CD capabilities along with a push toward containerization and virtualization to enhance its systems’ resiliency. McDaniel said using automated deployment and containerization limits the configuration creep of patching, because every new instance is automatically deployed from a known-good state. When paired with ATF’s more frequent deployments, that shrinks the window that adversaries have to create a persistent presence in the systems.
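The known-good-state idea McDaniel describes can be illustrated with a minimal Python sketch. The baseline dictionary and version strings below are hypothetical stand-ins for a pinned container image: because every deployment is rebuilt from the immutable baseline rather than patched in place, any drift (or an adversary's persistence attempt) on a live instance disappears at the next deploy:

```python
import copy

# A hypothetical known-good baseline image: every deployment starts
# from this exact configuration rather than patching a live instance.
KNOWN_GOOD = {"os_packages": {"openssl": "3.0.13"}, "app_version": "2.4.1"}

def deploy(overrides=None):
    """Create a fresh instance from the known-good baseline.

    Deep-copying the baseline means no deployment can mutate it,
    so configuration creep cannot accumulate between releases.
    """
    instance = copy.deepcopy(KNOWN_GOOD)
    if overrides:
        instance.update(overrides)
    return instance

# A drifted live instance is replaced, not repaired:
live = deploy()
live["os_packages"]["rogue-tool"] = "1.0"   # simulated drift or persistence
live = deploy()                             # redeploy wipes the drift
print("rogue-tool" in live["os_packages"])  # -> False
```

The more frequently deployments run, the shorter the window in which that simulated "rogue-tool" entry could survive, which is exactly the resiliency benefit paired with ATF's CI/CD cadence.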

And as ATF uses this method to re-architect its systems, it’s also implementing zero trust principles like least privilege, and continuous verification of identity and authorization. That’s an ongoing process McDaniel said will help ATF protect its application programming interfaces.

“Identity is so foundational to our cloud journey as well as the zero trust mandate. We’ve started some work on device. We’ve made inroads in multiple pillars,” Carney said. “What we need to do now, and we’re trying to drive towards, which is difficult in these constrained budget environments, is really getting that integrated plan to move together, to ensure that we’re taking everything into account as we’re planning our featured architectural state. So it’s a work in progress.”

Information sharing

All of this has been bolstered by increased information sharing among Justice Department components, both Carney and McDaniel said. Many of ATF’s systems are law-enforcement specific; there’s no need for agencies outside DoJ to have them. That limits the applicability of information sharing in wider venues, like the Chief Information Officers Council. But within DoJ, they’re sharing strategies that they find to be more effective than “the traditional, ‘let’s throw 500 FISMA controls at it’” strategies, Carney said.

“So we’ve been figuring a lot of it out as we go and refining our processes and sharing a number of our lessons learned with some of the other components,” McDaniel said. “And then for those that have been on the same path, we’re certainly taking what we can from them. But there’s definitely active lessons learned sharing going on, between all the components.”

The post ATF begins looking to new cyber strategies as it nears 100% cloud migration first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/feed/ 0
How Leidos manages many thousand endpoints through standardization, PCaaS https://federalnewsnetwork.com/federal-insights/2024/05/how-leidos-manages-many-thousand-endpoints-through-standardization-pcaas/ https://federalnewsnetwork.com/federal-insights/2024/05/how-leidos-manages-many-thousand-endpoints-through-standardization-pcaas/#respond Tue, 28 May 2024 16:42:28 +0000 https://federalnewsnetwork.com/?p=5017787 Leidos enterprise infrastructure leader John Morton shares strategies that aim to balance usability, flexibility and best-of-breed security.

The post How Leidos manages many thousand endpoints through standardization, PCaaS first appeared on Federal News Network.

]]>

This is the first article in our IT lifecycle management series, Delivering the tech that delivers for government.

Supporting employees and teams on the frontlines — the people on government contractor teams helping meet federal missions anytime, anywhere — is a balancing act.

“When you think about the endpoints, these are an extension of our network. A person — they travel, take it on the plane, go to a different office location. It’s a varying number of environments that the endpoint is in,” said John Morton, vice president of enterprise infrastructure at Leidos. “Making sure that we’re keeping security at the forefront — being able to secure our endpoints yet not disrupt the user — it’s a balancing act.”

For Morton and his team, that’s a continual element of managing an enterprise infrastructure that 46,000-plus Leidos employees globally rely on to meet business demands and deliver services to customers worldwide, many of them federal agencies.

It’s a job for which Morton is ideally suited given his experience leading and being part of teams that worked directly with federal agencies to meet their missions.

Morton shared how Leidos tackles these enterprise demands — achieving that balance while keeping an eye on the bottom line — for the Federal News Network series, Delivering the tech that delivers for government.

Deriving benefits from standardization, PCaaS

A chief aid in finding balance, Morton shared, has been to standardize endpoint offerings and implement PC as a service.

A chief benefit of PCaaS really is that it’s a full-fledged offering, he said. “You come in day one, you get an asset, all the software, all the labor, the maintenance, the management — that is all included in our PCaaS offering.”

Plus, while the company has standardized on laptops and desktops and software for endpoints, it provides a variety of devices based on user personas. There is continuity across the endpoint assets but also flexibility to meet specific user needs and adapt to client mission requirements too, Morton said.

“It’s a shared service model from a financial perspective. So as individuals come into the organization and opt in to PCaaS, obviously, it’s a certain cost across the organization,” he said. “That ultimately lowers your per user costs. So there are financial gains and efficiencies as well.”

Looking to AI to thwart adversarial attacks, proactively manage devices

While reducing friction for users remains critical, protecting corporate assets and data from cyberthreats is no less important.

It’s an area where Morton sees the potential for artificial intelligence to help, particularly with hardware. Leidos has an incubator where teams innovate how to apply AI to everyday needs.

“There’s now a focus from the adversary perspective, where they’re starting to attack below the operating system. When the user hits the [power] button, you’re automatically susceptible to adversarial attacks,” he explained. “We are now starting to work on how we protect the bootup time, the BIOS time, the things that go on below the operating system.”

Teams at Leidos are taking an offensive perspective. Criminals are using AI “for adversarial harm and ransomware attacks, thus we’re leveraging our AI capabilities not only at the OS level but at the hardware level through our partnerships,” Morton said.

Another innovative AI tactic? Gathering and analyzing telemetry data on users’ devices to proactively manage and maintain them rather than wait for problems to arise, he said.

“We are gaining insights and observability into those endpoints. … We’re trying to be a little more proactive and flip the script to more of a predictive analysis versus a reactive model — where we’re actually performing some self-healing based on certain tendencies and actions and events that are really occurring across the enterprise.”

It’s about the user — always keep that as your baseline

No matter what new technology he and his team implement, Morton said it always comes back to the employees, the users of the technology and tools.

“It’s centered around the user, the user persona and user experience,” he said. “Are the users happy? Have we done our job and really made it somewhat transparent? Are we providing that workplace of the future, providing digital ambidexterity?”

Ultimately, can a user work no matter what scenario arises?

Again, Morton expects AI and increased telemetry data about users’ devices to pay dividends in ensuring continuity of operations. “These things will give us additional data, will provide better observability and analytics, to make informed business decisions and really increase overall user experience.”

Discover more stories about how federal systems integrators and government contractors manage their enterprise infrastructure environments in our series Delivering the tech that delivers for government, sponsored by Future Tech Enterprise.

To listen to the full discussion between Leidos’ John Morton and Federal News Network’s Vanessa Roberts, click the podcast play button below:

The post How Leidos manages many thousand endpoints through standardization, PCaaS first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/how-leidos-manages-many-thousand-endpoints-through-standardization-pcaas/feed/ 0
‘The people’s agency:’ USDA manages more high-impact services than any other agency https://federalnewsnetwork.com/federal-insights/2024/05/the-peoples-agency-usda-manages-more-high-impact-services-than-any-other-agency/ https://federalnewsnetwork.com/federal-insights/2024/05/the-peoples-agency-usda-manages-more-high-impact-services-than-any-other-agency/#respond Tue, 28 May 2024 15:12:20 +0000 https://federalnewsnetwork.com/?p=5005557 USDA counts six of its component agencies as High-Impact Service Providers (HISPs) — the most of any department across the federal government.

The post ‘The people’s agency:’ USDA manages more high-impact services than any other agency first appeared on Federal News Network.

]]>

The Agriculture Department traces the start of its customer service journey back to its launch in 1862.

President Abraham Lincoln at the time dubbed USDA “the people’s department,” since nearly half of all Americans at the time lived on farms.

USDA’s mission still revolves around delivering services to the public — although its customer base extends well beyond just farmers.  The sprawling department also provides benefits to children and families, as well as the general public.

“We are called the people’s agency, and I think rightly so,” said USDA Chief Customer Experience Officer Simcah Suveyke-Bogin. “It’s numerous amounts of services, but all really participating in that larger ecosystem of our agriculture, and a lot of what we’re having to use day-to-day as a member of the public.”

USDA counts six of its component agencies as High-Impact Service Providers (HISPs) — the most of any department across the federal government.

Those HISP agencies include the Farm Service Agency, the Natural Resources Conservation Service, Rural Development, Recreation.gov, the Forest Service and the Food and Nutrition Service.

When President Joe Biden signed an executive order in December 2021 calling on the entire federal government to step up its customer service to the public, USDA stood out as an agency already achieving these goals.

“We’re so proud to have a department-level focus on customer experience. This allows us to really think across the board, not just one agency at a time, but be able to really collaborate across agencies on what is it that we want to do. Because we have so many different parts of USDA … it is very difficult sometimes to wrap your arms around all the different customers and all the different needs, and improving all the services at once. There’s not one entity that can really do it all,” Suveyke-Bogin said.

Understanding the ‘voice of the customer’

USDA components are also leading their own CX initiatives. Rural Development has its own office dedicated to customer experience, and the Farm Service Agency is spearheading its own initiatives around incorporating customer feedback into service delivery improvements.

“By introducing practices like this, we’re seeing that some of the agencies at USDA have their own focus areas now on customer experience,” Suveyke-Bogin said.

USDA continues to find new ways to deliver a higher level of service to its customers. Earlier this year, the department launched a departmentwide policy on its “Voice of the Customer” program.

“This particular policy really reflected how important it is to be engaging and listening and measuring what our customers are saying, whether it’s about our brand hospitality, whether it’s about a particular service, just so we can introduce a little more mechanism of bringing that voice back into the department, and then allowing us to use it as additional data to make decisions,” Suveyke-Bogin said.

USDA incorporates customer feedback, as well as feedback from its employees, to identify pain points and bottlenecks in the services it provides.

“A lot of [employees] are really trying to deliver great service, and they deliver great customer service to our customers. But there are times where it gets very difficult and understanding it, observing where those hurdles are can really help us — whether it’s from a policy perspective, whether it’s operationally we need to make some changes, where to prioritize some of those adjustments. So, the employees are really a key to understanding all those opportunities.”

The Office of Management and Budget, in its recent guidance on delivering a “digital-first public experience,” said a majority of the public accesses government services online, and that a growing segment of that traffic comes from mobile devices.

To deliver on these goals, USDA recently launched its own digital service to provide a higher level of service to the public online.

“It’s really important that our customers feel that their government is working for them and really trying to make things easier. I think the normal sentiment around using government services is that it’s hard, it’s difficult, it’s time-consuming. And we really want to change that and build trust. The memo elevates that quite a lot, not just with the HISPs and the high-impact service elements of those projects, but really emphasizing that we do it for all of the digital service delivery aspects of government.”

Building trust through customer experience

USDA is also taking a targeted approach to ensure its CX improvements are equitable and reach historically underserved populations. To meet these goals, the department often partners with academic institutions and community groups that have a more granular understanding of a local population’s needs.

“A lot of different entities already have that trust level, and already have that access to some of the communities that maybe are harder for us to reach. And by partnering with them, it’s not only getting the word out, but it’s also us understanding some of that culture, some of the environment that we need to be considering when we deliver some of these services. And for us, that’s really terrific input to figure out, do we need to redesign this next time we put a service out similarly in the next year?” Suveyke-Bogin said.

USDA sees customer service delivery as a unique opportunity to build public trust in government institutions, and that individuals who receive excellent customer service from the department will become repeat customers.

“We’re very keen on making sure that we are elevating the trust level with the public that are using our services, at the end of the day. In order to do that, we have to understand what is their experience today, and baseline that. A big part of our Voices program is really to baseline that, especially for the programs who have never really measured that in a very methodical way, and introducing this type of metrics collection,” Suveyke-Bogin said.

“We always talk to different services at USDA, and we tell them that this is just one element of understanding the performance of your service. This is not the end-all-be-all, but it’s something to layer in with some of the operational key performance indicators that they’re already measuring, to see if they’re meeting their performance levels,” she added.

Discover more about how to elevate your customer experience in the “Excellent, equitable and secure customer experience: A closer look at high-impact service providers” series. 

The post ‘The people’s agency:’ USDA manages more high-impact services than any other agency first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/the-peoples-agency-usda-manages-more-high-impact-services-than-any-other-agency/feed/ 0