Robust data management is key to harnessing the power of emerging technologies

The recent AI Executive Order aptly states that AI reflects the data upon which it is built. Federal agencies are looking to responsibly implement cutting-edge IT innovations such as artificial intelligence, machine learning and robotic process automation to improve customer experiences, bolster cybersecurity and advance mission outcomes. Accessing real-time, actionable data is vital to achieving these essential objectives.

Comprehensive data management is key to unlocking seamless, personalized and secure CX for government agencies. Real-time data empowers informed, rapid decision-making, which can improve critical, high-impact federal services where time is of the essence, such as in response to a natural disaster. Alarmingly, only 13% of federal agency leaders report having access to real-time data, and 73% feel they must do more to leverage the full value of data across their agency.

While some agencies are making progress in their IT modernization journeys, they continue to struggle when it comes to quickly accessing the right data due to numerous factors, from ineffective IT infrastructure to internal cultural barriers.

Actionable intelligence is paramount. The ultimate goal is to access the right data at the right moment to generate insights at “the speed of relevance,” as leaders at the Defense Department would say. To achieve the speed of relevance required to make real-time, data-driven decisions, agencies can take steps to enable quicker access to data, improve their data hygiene, and secure their data.

How to effectively intake and store troves of data

From a data infrastructure perspective, the best path to modernized, real-time deployment is using hyperautomation and DevSecOps on cloud infrastructure. Many federal agencies have begun this transition from on-premises to cloud environments, but there’s still a long way to go until this transition is complete government-wide.

Implementing a hybrid, multi-cloud environment offers agencies a secure and cost-effective operating model to propel their data initiatives forward. By embracing standardization and employing cloud-agnostic automation tools, agencies can enhance visibility across systems and environments while adhering to service-level agreements and ensuring the reliability of their data platforms. Once a robust infrastructure is in place to store and analyze data, agencies can turn their attention to data ingestion tools.

Despite many agency IT leaders utilizing data ingestion tools such as data lakes and warehouses, silos persist. Agencies can address this interoperability challenge by prioritizing flexible, scalable and holistic data ingestion tools such as data mesh. Data mesh tools, which foster a decentralized data management architecture to improve accessibility, can enable agency decision-makers to capitalize on the full spectrum of available data, while still accommodating unique agency requirements.

To ensure data is accessible to decision-makers, it’s important that the data ingestion mechanism has as many connectors as possible to all of the data sources an agency identifies. Data streaming and data pipelines can also enable real-time insights and facilitate faster decision-making by reducing manual processes. Data streaming allows data to be ingested from multiple systems, which can build a single source of truth for analytical systems. Additionally, these practices limit data branching and silos, which can cause issues with data availability, quality and hygiene.
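To make the connector-and-streaming idea concrete, here is a minimal Python sketch. The source systems, field names and in-memory "sink" are invented for illustration; an agency pipeline would plug the same pattern into managed streaming and storage services rather than Python lists.

```python
from dataclasses import dataclass
from typing import Callable, Iterator, List


@dataclass
class Record:
    source: str   # which system the record came from
    payload: dict  # raw fields; the schema varies by source


def case_system_connector() -> Iterator[Record]:
    """Hypothetical connector for a case-management system."""
    for row in [{"case_id": "A-1", "status": "open"},
                {"case_id": "A-2", "status": "closed"}]:
        yield Record(source="case_system", payload=row)


def call_center_connector() -> Iterator[Record]:
    """Hypothetical connector for a call-center log export."""
    for row in [{"ticket": 101, "wait_seconds": 240}]:
        yield Record(source="call_center", payload=row)


def stream(connectors: List[Callable[[], Iterator[Record]]],
           sink: Callable[[Record], None]) -> None:
    """Pull from every registered connector into one shared sink, so
    analytical systems read from a single consistent stream instead of
    per-system extracts that branch and drift."""
    for connector in connectors:
        for record in connector():
            sink(record)


if __name__ == "__main__":
    landed: List[Record] = []
    stream([case_system_connector, call_center_connector], landed.append)
    print(f"Ingested {len(landed)} records from "
          f"{len({r.source for r in landed})} source systems")
```

Adding another source in this model means registering one more connector, which is the practical payoff of prioritizing connector coverage rather than building point-to-point extracts.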

Data hygiene and security enable transformative benefits

Data hygiene is imperative, particularly when striving to ethically and accurately utilize data for an autonomous system like AI or ML. A robust data validation framework is necessary to improve data quality. To create this framework, agencies can map their data’s source systems and determine the types of data they expect to yield, but mapping becomes increasingly arduous as databases continue to scale.

One critical success factor is to understand the nature of the data and the necessary validations prior to ingesting the data into source systems. Hygiene can be improved by consuming the raw data into a data lake and then, during conversion, validating the data’s quality before applying any analytics or crafting insights.
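As a rough illustration of such a validation framework, the sketch below maps hypothetical source systems to the fields and types they are expected to yield, then flags records that do not conform before they reach analytics. The system names and schemas are invented for this example.

```python
from typing import Any, Dict, List

# Hypothetical mapping of source systems to the fields and types they are
# expected to yield -- the "map your data's source systems" step.
EXPECTED_SCHEMA: Dict[str, Dict[str, type]] = {
    "case_system": {"case_id": str, "status": str},
    "call_center": {"ticket": int, "wait_seconds": int},
}


def validate(source: str, record: Dict[str, Any]) -> List[str]:
    """Return a list of problems found in one raw record; an empty list
    means the record passes validation and can move on to analytics."""
    problems = []
    schema = EXPECTED_SCHEMA.get(source)
    if schema is None:
        return [f"unknown source system: {source}"]
    for field, expected_type in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"{field} should be {expected_type.__name__}")
    return problems


if __name__ == "__main__":
    raw = {"case_id": "A-3", "status": 7}   # bad record: status is not text
    print(validate("case_system", raw))     # ['status should be str']
```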

In addition to data hygiene, data security must remain a top priority across the federal government as agencies move toward real-time data insights. Adopting a hybrid, multi-cloud environment can lead to a stronger security posture because there are data encryption capabilities inherent in enterprise cloud environments.

Agencies may consider using a maturity model to help their teams assess data readiness and how they are progressing in their cybersecurity frameworks. A maturity model lets agencies identify and understand specific security gaps at each level of the model and provides a roadmap to address these gaps. Ultimately, the cybersecurity framework is as essential as data hygiene to ensure agencies can harness data reliably and efficiently.
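One way to picture a maturity model in practice is a simple gap report: for each level of the model, list the controls an agency has not yet put in place, and close the lowest level's gaps first. The levels and control names below are illustrative only, not drawn from any official framework.

```python
# Hypothetical four-level maturity model; level contents are assumptions
# made for this sketch, not an official cybersecurity framework.
MATURITY_MODEL = {
    1: ["asset inventory"],
    2: ["data classification", "encryption at rest"],
    3: ["centralized logging", "access reviews"],
    4: ["continuous monitoring", "automated incident response"],
}


def gap_report(controls_in_place: set) -> dict:
    """For each maturity level, list the controls still missing; the
    result doubles as a rough roadmap for closing security gaps."""
    return {
        level: [c for c in controls if c not in controls_in_place]
        for level, controls in MATURITY_MODEL.items()
    }


if __name__ == "__main__":
    assessed = {"asset inventory", "data classification", "access reviews"}
    for level, gaps in gap_report(assessed).items():
        print(f"Level {level}: {'no gaps' if not gaps else ', '.join(gaps)}")
```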

When agencies have data management solutions that reduce the friction of navigating siloed government systems and enable faster, more secure collaboration, they are better positioned to drive innovation. This is especially true for agencies that handle extensive amounts of data. For example, many High Impact Service Providers (HISPs) must manage vast amounts of citizen data to provide critical, public-facing services with speed and scale.

Data is the foundation for modern digital government services. Once data is ingested, stored and secured effectively, the transformational potential of emerging technologies such as AI or RPA can be unlocked. Moreover, with real-time data insights, government decision-makers can use actionable intelligence to improve federal services. It’s essential that agency IT leaders invest in a robust data management strategy and modern data tools to ensure they can make informed decisions and benefit from the power of AI to achieve mission-critical outcomes for the American public.

Joe Jeter is senior vice president of federal technology at Maximus.

Countdown to Compliance: Understanding NARA’s rules for text messaging

Federal agencies have just weeks to prepare for changes in digital record standards; here’s how agencies can help ensure compliance.

This content was written by George Fischer, Senior Vice President, Sales, T-Mobile Business Group.

A pivotal deadline looms large for federal agencies: digital message compliance. Starting June 30, 2024, all federal agencies will be required to archive agency records digitally. That’s right, the end of paper culture is finally here. But for far too long, there has been confusion over what exactly digital messages entail and what the expectations are for reporting them, along with a lack of tools to submit data to the National Archives and Records Administration (NARA) in a way that’s easy, secure and proactive. And to further complicate things, NARA broadened the meaning of digital messages in January 2023 to include text messages, so there’s yet another factor to consider. Let’s focus on text messages since that’s the latest addition.

So many important business transactions, official policies and decisions are done via texting, so failing to archive them can make it hard to stay transparent, accountable and in compliance with the law. Once NARA’s deadline hits, non-compliance can lead to information gaps during federal investigations, create PR headaches, and potentially result in substantial fines and penalties.

Understanding NARA’s text message regulations

So, what exactly is NARA tracking? Text messages from federal workers are deemed public records and must be archived. NARA is expecting access to digital messages sent or received by federal employees, including SMS and MMS messages – meaning photos, videos, voice notes and even emojis. Considering these different types of messages plus the fact that they need to be monitored across agency networks, personal devices and different phone operating systems like Android and iOS means there are several layers of complexity to navigate.

NARA also expects agencies to retain metadata associated with these texts, including timestamps, device information, attachments and even emoji reactions. Yes, you read that right – even a simple thumbs-up emoji might serve as evidence in a federal case.

The NARA guidelines recommend evaluating whether messages need to be archived based on whether they contain (a minimal screening sketch follows the list):

  • Evidence of agency policies, business or mission
  • Information that is exclusively available in electronic messages
  • Official agency information
  • A business need for the information
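As a minimal illustration of how those criteria and NARA’s metadata expectations might be screened programmatically, the Python sketch below defines a hypothetical message record and applies the four questions. The field names are invented, and the yes/no answers stand in for judgments a records officer or policy engine would supply; real capture tools and metadata formats will differ.

```python
from dataclasses import dataclass, field


@dataclass
class TextMessage:
    sender: str
    body: str
    # Metadata NARA expects agencies to retain alongside the message itself.
    timestamp: str = ""
    device: str = ""
    attachments: list = field(default_factory=list)
    reactions: list = field(default_factory=list)   # e.g. a thumbs-up emoji


def needs_archiving(criteria: dict) -> bool:
    """Apply the four NARA screening questions; any 'yes' means the
    message should be archived. The answers are human/policy judgments
    this sketch cannot automate."""
    return any(criteria.values())


if __name__ == "__main__":
    msg = TextMessage(sender="program.manager@agency.gov",
                      body="Approved the draft policy, send it out today.",
                      timestamp="2024-06-17T10:32:00Z",
                      device="GOV-PHONE-12",
                      reactions=["👍"])
    verdict = needs_archiving({
        "evidence of agency policies, business or mission": True,
        "information exclusively available in electronic messages": False,
        "official agency information": True,
        "business need for the information": False,
    })
    print(f"Archive message from {msg.sender}: {verdict}")   # -> True
```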

It’s clear that federal agencies are facing an increasingly complex and dynamic digital landscape filled with constantly changing expectations. Outdated processes are no match for this complexity, and they’re holding agencies back from staying compliant. The answer? Solutions that do the heavy lifting. To make life easier, agencies should have a platform that automatically captures text messages, images, and videos, and is tightly integrated with their wireless provider. It should also leverage the latest security protocols and make it easy to generate reports that are audit ready.

Streamlining text message archiving for federal agencies

Companies like 3rd Eye Technologies have spent years perfecting a solution to keep data safe for federal organizations, including agencies in charge of the highest levels of national security and intelligence. That’s why T-Mobile teamed up with the mobile solutions provider to make it as easy as possible for federal customers to know that the data they’re archiving is not only easy to manage but safe.

Mystic Messaging Archival is a turn-key solution from 3rd Eye Technologies that specializes in securely capturing and archiving texts, including SMS and MMS message logs, for federal and enterprise customers. Mystic is fully integrated into the T-Mobile network, meaning there is no need for any additional applications or software on the phone, making implementation across the agency simple and swift once the agency purchases the solution from 3rd Eye Technologies or T-Mobile. And because the solution is configured at the network level, it archives every SMS/MMS message in real time and stores it securely for reporting, so messages do not need to be self-reported unless specified by agency protocols. The messages then travel over 5G to a hosted cloud where they are archived, and the data remains owned by the agency.

Mystic’s cloud-based solution is also “FedRAMP Ready” in the FedRAMP marketplace, which means it is ready for Agency Authority to Operate (ATO). Not all archiving solutions have that distinction due to the highly rigorous standards involved, so it’s a major advantage. And when pairing Mystic technology with T-Mobile’s nationwide 5G network and 5G standalone technology, messages are transmitted over a secure channel, enhancing protection against vulnerabilities such as cyber attacks (commonly found in Wi-Fi networks).

Mystic also ensures that SMS/MMS data from any lost, stolen, or damaged mobile device is automatically archived, safeguarding information despite the physical status of the device.

Preparing for NARA compliance

Mystic’s eDiscovery console – the mechanism that actually generates the reports — is designed to streamline the entire process of collecting, storing, managing, securing and reviewing text messages from mobile devices. This centralized reporting console consolidates all data from subscribed agency enterprise mobile devices. This console is accessible by the Agency Headquarters, allowing for efficient management and oversight of all archived communications. This way agencies can quickly and easily respond to all types of legal requests, investigations or regulatory requirements. And because Mystic and T-Mobile are already tightly integrated through the 5G network, getting set up takes only 10 days or less.

Here’s the bottom line: agencies need to move fast. The NARA deadline is close and the right tools and partners will make all the difference in preparing for it. The clock is ticking, but it’s not too late to get ahead of the game with a solution that makes text archiving easy, integrates into your existing processes seamlessly and stays up to date with the latest guidelines so you don’t have to.

When it comes to AI at Energy, it takes a village

Rob King, the chief data officer at the Energy Department, said a new data strategy and implementation plan will set the tone for using AI in the future.


Federal chief data officers are playing a larger role in how their organizations are adopting and using basic or advanced artificial intelligence (AI).

A recent survey of federal chief data officers by the Data Foundation found over half of the CDOs who responded say their role around AI has significantly changed over the past year, as compared to 2022 when 45% said they had no AI responsibility.

Taking this a step further, with nearly every agency naming a chief AI officer over the past year, the coordination and collaboration between the CDO and these new leaders has emerged as a key factor in the success of any agency AI program.

“We are taking a collaborative and integrated approach to aligning data into artificial intelligence and building synergies between the role of data and data governance, and really being able to meet the spirit of the requirements of the AI executive order, with the ability to interrogate our data ethically and without bias as they are being imported into artificial intelligence models,” said Rob King, the chief data officer at the Energy Department, on the discussion Government Modernization Unleashed: AI Essentials. “We’re really now trying to ensure that we can back in the appropriate governance management, make sure we have oversight of our AI inventories and start to align the right controls in place from a metadata management and from a training data standpoint, so that we can meet both the letter and the spirit of the AI executive order. We don’t just want to be compliance driven, but ensure that we are doing the right thing to leverage those AI models to their full extent, and make sure that we can accelerate the adoption of them more broadly.”

For the adoption King describes to happen more broadly and more quickly, data must be prepared, managed and curated to ensure that AI, or really any technology tool, works well.

CDOs in a unique position

He said AI is just the latest accelerator that has come along that reemphasizes the importance of understanding and protecting an organization’s data.

“How do we use AI to help us look for themes, patterns of usages in our data to advance the classification and tagging of our data from a stewardship standpoint, so that we can understand that whole full cycle? We’re calling things like data-centric AI to ensure that we’re looking at ways to use non-invasive data governance approaches to help meet the mission needs of AI. It’s a great feedback loop,” King said. “We’re using AI to drive the maturity of our processes so that we can advance the mission adoption of AI as well. The CDOs are in a unique position because we live by the tenets of ‘it takes a village.’ It takes us working with policy and process leaders, and now the chief AI officers (CAIOs) and mission stakeholders, bringing us all together to really drive the outcomes of strong data management practices, now aligned to positioning for AI adoption.”

King, who has been the CDO at Energy for almost a year, said policies like the Federal Data Strategy or the Evidence-Based Policymaking Act have created a solid foundation, but the hard work that still must happen will be by CDOs and CAIOs as they put those concepts into action.

One way King started down this data management journey was by developing an enterprise data strategy and “recharging” DoE’s data governance board by ensuring all the right stakeholders with the right subject matter expertise and relevancy are participating.

“We’re on the precipice of completing that strategy. It’s been published in a draft format to our entire data governance board members for final review and edit. We hope to bring that to the finish line in the next few weeks,” he said. “From there, we’re already moving right into a five-year implementation plan, breaking it down by annual increments to promote that strategy, recognizing that our science complex, our weapons complex and our environmental complexes have very different needs.”

Testing AI has begun

The new data strategy will lay out what King called the “North Star” goals for DoE around data management and governance.

He said the strategy details five goals, each with several objectives and related actions.

“We wanted to make sure that everyone could see themselves in the strategy. The implementation plan is going to be much more nuanced. We’re now taking key stakeholders from our data governance group and building a team with appropriate subject matter experts and mission representatives to build out that implementation plan and to account for those major data types,” he said. “The other thing we’re starting to look at in our strategy is [asking] what is the right ontology for data sharing? We should have a conceptual mission architecture that can show where we can accelerate our missions, be it on the weapons side or on the science and research side. Where can we build ontologies that say we can accelerate the mission? Because we’re seeing like functions and like activities that, because of our federated nature at the Department of Energy, we can break down those silos, show where there’s that shared equity. That could be some natural data sharing agreements that we could facilitate and accelerate mission functions or science.”

Even as Energy finalizes its data strategy, its bureaus and labs aren’t waiting to begin testing and piloting AI tools. Energy has several potential and real use cases for AI already under consideration or in the works. King said one example is applying AI to mission-critical priorities such as the move to a zero trust architecture and work in the cyber domain. Another is applying AI to hazards analysis through DoE’s national labs.

King said the CDO and CAIO are identifying leaders and then sharing how they are applying AI to other mission areas.

“I’m trying to partner with them to understand how I can scale and emulate their goodness, both from pure data management standpoint as well as artificial intelligence,” he said. “We have one that the National Nuclear Security Administration is leading, called Project Alexandra, around non-nuclear proliferation. They’re doing a lot of great things. So how do we take that and scale it for its goodness? We are seeing some strategic use cases that are of high importance. The AI executive order says our foundational models need to be published to other government agencies, academia and industry for interrogation. So how do we then start to, with the chief AI officer, say what is our risk assessment? And what is our data quality assessment for being able to publish our foundational models to those stakeholders for that interrogation? How do we start to align our data governance strategy and use cases to some of our AI drivers?”

Gen. Rey reflects on leading Network Cross Functional team

Maj. Gen. Jeth Rey focused on four pillars, including agnostic transport and moving the Army toward a data-centric environment, over the last three years.


Maj. Gen. Jeth Rey ended his three-year tenure as the director of the Army’s Network Cross Functional team last week. When he started in 2021, Rey laid out a four-pronged vision to move the Army toward a data-centric environment.

Rey, who moved to a new job at the Pentagon as the director of architecture, operations, networks and space at the Office of the Deputy Chief of Staff, G-6, said the Army has made tremendous progress to become a data-centric organization over the last three years.


“The problem said that we had in the Army, and across DoD, is we didn’t have a data problem, we had a data management problem,” Rey said in an interview at the Army TEMS conference. “Therefore, we tried to find a way to get to data centric using agnostic transport to move the data as freely as possible to where it needs to go, a cloud-enabled asset to catch and move the data, and then, obviously, you needed a layered security architecture. We wanted a multi-level security architecture where we can move the data from one classification to another seamlessly.”


Brig. Gen. Patrick Ellis, the former deputy chief of staff, G-3 for the Army Europe-Africa took over for Rey in early June.

Under the Network Cross Functional team, Rey’s four pillars were:

  • Agnostic transport
  • Moving to a data-centric environment from a network-centric environment
  • Implementing a multi-level security architecture to include a zero trust architecture
  • Ensuring cybersecurity is considered early as part of system development

Rey said he worked closely with Army Program Executive Office Command, Control and Communications Tactical (PEO-C3T) and the Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance and Reconnaissance (C5ISR) Center in the Army Combat Capabilities Development Command to take the vision and make it into a reality.

“My role is setting the vision and then keeping the momentum going forward. I would set a timeframe that I would want to see a part of the project achieved, and then I just continue to drive the momentum going forward,” Rey said. “We are the influencers as the Network Cross Functional team to get to the end state and keep people focusing on track.”

Army’s transport is now multi-threaded

The Army demonstrated its progress in advancing these capabilities over the past few years at Project Convergence and at NetModX, one of its major exercises, which is run by the C5ISR Center.

Rey said one way the Army is better off than it was three years ago is how it processes data across multiple infrastructure approaches.

At one time, soldiers could use only one type of transport, a single-threaded approach, such as relying solely on Geostationary Operational Environmental Satellites (GOES).

He said the C5ISR office created automated planning for the primary, alternate, contingency and emergency (PACE) communications plan to enable a multi-threaded approach to transport.

“I wanted to see if there was a way to automate pace that we could go from 5G to low Earth orbit (LEO) satellite to GOES to medium Earth orbit (MEO) satellites. I think, three years later, we are almost there as an accomplishment when it comes to that part of our pillar,” Rey said.
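A rough sketch of what such an automated PACE decision could look like appears below. The link names mirror the transports Rey mentions, but the ordering and the health checks are simulated assumptions, not real link telemetry or Army software.

```python
from typing import Dict, List

# Hypothetical PACE plan ordered from primary to emergency.
PACE_ORDER: List[str] = ["5G", "LEO satellite", "GOES", "MEO satellite"]


def select_transport(pace_order: List[str],
                     link_is_up: Dict[str, bool]) -> str:
    """Return the first available link in PACE order -- the automated
    fail-over decision a soldier would otherwise make manually."""
    for link in pace_order:
        if link_is_up.get(link, False):
            return link
    raise RuntimeError("no transport path available")


if __name__ == "__main__":
    # Simulated link status: 5G is jammed, LEO is degraded, GOES is up.
    status = {"5G": False, "LEO satellite": False,
              "GOES": True, "MEO satellite": True}
    print(select_transport(PACE_ORDER, status))   # -> GOES
```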

A second pillar where Rey believes the Army has made significant progress is the move to a data-centric environment. He said advancements in the network architecture are a big part of this change.

“I believe that the way data is being approached today is a little different. I think what we need to think about is the way we create data because today data is stored on your laptop or it’s stored on your phone or it is stored in a data center or it stored in the cloud. It’s still really siloed, and from my perspective, we need more of a in a large data fabric where we can catch and make sense of data by using artificial intelligence and machine learning,” he said. “We need open application programming interfaces (APIs) in order for us to be able to share data. If we get to a point where I’d like down to the attribute base level of data sharing. Until we actually get there, we will continue to have data siloed the way we are today.”

The Army took a big step in this direction in January, starting to implement its unified data reference architecture (UDRA). The service recently completed version 1.0 of the UDRA while also building out an implementation plan of the framework in partnership with the Army Combat Capabilities Development Command (DEVCOM).

Keep the momentum going

The Army expects UDRA to bring together principles and efforts for data mesh and data fabric. While data mesh involves a decentralized approach where data product ownership is distributed across teams and domains, the data platform will facilitate seamless access and integration of data products from different formats and locations.
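To illustrate the split described here between decentralized ownership (data mesh) and a common platform for access and integration (data fabric), here is a minimal, hypothetical Python sketch. The product names, domains and storage locations are invented, and UDRA’s actual interfaces are not described in this article.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class DataProduct:
    name: str
    owner_domain: str   # mesh: ownership stays with the producing team
    fmt: str            # e.g. "parquet", "json", "geotiff"
    location: str       # where the product actually lives


class Catalog:
    """Toy stand-in for the platform layer: one place to register and
    discover products while ownership remains decentralized."""

    def __init__(self) -> None:
        self._products: Dict[str, DataProduct] = {}

    def register(self, product: DataProduct) -> None:
        self._products[product.name] = product

    def find_by_domain(self, domain: str) -> List[DataProduct]:
        return [p for p in self._products.values()
                if p.owner_domain == domain]


if __name__ == "__main__":
    catalog = Catalog()
    catalog.register(DataProduct("sensor-tracks", "fires",
                                 "parquet", "s3://fires/tracks/"))
    catalog.register(DataProduct("logistics-status", "sustainment",
                                 "json", "s3://sustainment/status/"))
    print([p.name for p in catalog.find_by_domain("fires")])
```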

Rey said the concepts that make the data mesh and data fabric work go back to creating a unified network, especially in the tactical environment.

“There are two separate areas that we’re trying to unify together. In the tactical space is where we believe the data fabric is more important for us today because of all the sensors that are on the battlefield and in order to make sense of the information that’s out there,” he said. “That is the catcher’s mitt that needs to ingest the data, use analytics and then egress of data for the commander to make an informed decision across the board. I think we’re we have a lot of momentum right now. We’ve talked about the next generation of command and control systems that’s coming, and that’s going to be an ecosystem that allows us to really have a more robust type of data environment that will move data and echelon.”

Army Chief of Staff Gen. Randy George on May 28 signed off on the Next Generation Command and Control (NGC2) Capability Characteristics (C2 Next).

Rey said creating data in a way that also foresees wanting to share it remains one of the biggest challenges for the Army.

“The only way you can share it is if we decide what those attributes are going to look like, whether I’m with a partner or whether I’m just dealing with a US entity,” he said. “So, attributes are going to be key with how we tag label the data, and then be an in are able to share it at the end of the onset.”

As for the new director of the Network Cross Functional team, Rey said his advice to Ellis was simple: “Don’t allow the momentum to slow down.”

Agile, adaptable, modular: The future of Army C2

The Army’s Next Generation Command and Control (NGC2) Capability Characteristics or C2Next is the roadmap for developing a different kind of command post.


For the Army, the command post of the future will need to be agile, resilient and intuitive.

It will be a big lift not only for the Army, but for the contractors who are building the technology to support it.

This is one of many reasons why the Army Chief of Staff Gen. Randy George on May 28 signed off on the Next Generation Command and Control (NGC2) Capability Characteristics (C2 Next).

The Army released a notice on SAM.gov to say the characteristics of need are available, but vendors have to “apply” to see them as they are not public.

George and other Army senior leaders, speaking at the Army TEMS conference in Philadelphia last week, offered a preview of the characteristics, outlining key concepts and insight into what command and control of the future needs to encompass.

George said with the network being the Army’s top priority, these new characteristics are a key building block.

“I was out at the National Training Center I think it was March for Project Convergence. One of the things that I challenged everybody a year ago, and especially Army Future Command, was I want to be able to be on the network and I want us to be able to operate with tablets, phones, software-defined radios in a very simple architecture. What I saw when I was out there in March is that the technology exists now to do those kinds of things,” George said. “We had a platoon leader talking to a company commander or talking to a battalion commander talking to a brigade commander, and they were talking on tablets. All those big systems that we used to have, the Advanced Field Artillery Tactical Data System (AFATDS) is one of them, can be an app. It can be on that tablet. So rather than having a truck or two trucks and 10 people, you have an application. That’s where we have to go.”

George said the commanders were excited about these capabilities because it speeds the decision process and makes them more lethal.

Army details C2 Next

The Army developed this initial set of C2 Next characteristics to support the concepts George talked about: speed to decision, the lethality of the units, and the ability to adapt and be agile based on real-time threats, challenges and needs.

Joe Welch, the director of the Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance and Reconnaissance (C5ISR) Center at the Army Combat Capabilities Development Command, said the characteristics are not just a list of capabilities to build or have; they describe the ability to tailor and adapt C2 for a commander and their staff based on their needs and information requirements. He said the characteristics are not necessarily the nuts and bolts of specific systems.


Welch outlined several focus areas for C2 Next, starting with a key ingredient: agnostic transport, meaning the data gets to users no matter the infrastructure, whether cloud, satellite or an on-premises data center.

“[It has to be] robust and resilient. We’ve been making lots of progress in terms of that, not just in the variety of transport paths that we have for our networks to be able to support data transmission, but to do it in an automated way and a highly secure way,” Welch said. “I see this as a continued evolution. In the characteristics of need, we talk specifically about being threat informed in this area. We started from a perspective of, we just need to be able to communicate; we need to be able to get the data where it needs to go in order to accomplish the mission.”

A second area that will be critical, Welch said, is a robust services architecture that is cloud native and based on open systems standards that let commanders easily iterate new capabilities.

“A consistent theme here recently is as-a-service. We’re seeing that in more and more areas. What’s really meant by that is that we don’t want to be fixed on any particular thing. We want to be able to experiment, prototype, move very quickly into deployment, and use something as long as it’s working, and be able to challenge it when there’s something that’s better out there when the need changes or the technology changes,” he said. “That gets into a lot more mechanics than the concepts or the capabilities that we’re describing. But it’s a very fundamental underpinning of where we’re looking to go.”

Testing C2 characteristics

Welch added C2 Next is part of a necessary and complete revamp of the way that the Army will generate, produce, consume and discover data, and it’s necessary in order to apply machine learning to it at all.

He said if the Army wants to be able to do informed and enabled decision making much faster than the adversary, then the characteristics of need will play a huge role.

The Army has been testing many of these concepts over the last 12-18 months and improving them as it went along. Most recently at Project Convergence, an annual technology and capability demonstration, Dr. Jean Vettel, Next Generation C2 chief scientist for experimentation at Army Capabilities Command for C5ISR, said they measured some of the benefits of the C2 Next characteristics.

“In the characteristics of need, you’ll see that we’ll talk a lot about modularity or…this really focus we have on composability. What does that actually mean?” Vettel said.

As an example, Vettel described commanders developing a plan to set up a command post in minutes rather than hours using commercial Raspberry Pi devices.

“Within that they had 16 Raspberry Pi’s that they put out, where they emulated the electromagnetic signature of the command post as decoys. Whenever we think about adaptability, what is the metric of adaptability that would be successful for Next C2?” she said.

The idea was to protect the command and control technologies from jamming or other cyber attacks. Vettel said this is an example of how the C2 Next characteristics emphasize adaptability.

“If it’s adaptable, that means that in the fight, whenever a peer adversary now has identified that we’re creating our decoys with electromagnetic signatures, then our warfighters need to have access to data that they couldn’t tell us they would need beforehand,” she said. “They have to have the ability to know what data they have available and how do they try to spoof or create a different decoy because they have access to the data because it’s adaptable to what they need to fight the peer adversary.”

She added that the example also shows another key piece of the C2 Next characteristics: building capabilities in modules that can be plugged in, removed and changed as necessary.
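To make the modularity idea concrete, here is a minimal sketch, in Python, of capabilities registering against a common interface so they can be plugged in, removed or swapped without rebuilding the rest of the stack. Every name in it is hypothetical; it is not drawn from any Army system or from the C2 Next documents.

```python
# Minimal, hypothetical sketch of "plug in, remove, change" capability modules.
class CapabilityRegistry:
    def __init__(self):
        self._modules = {}

    def plug_in(self, name, handler):
        """Register a capability module under a well-known name."""
        self._modules[name] = handler

    def remove(self, name):
        """Unplug a module when the mission or threat changes."""
        self._modules.pop(name, None)

    def run(self, name, *args, **kwargs):
        if name not in self._modules:
            raise KeyError(f"no capability registered for '{name}'")
        return self._modules[name](*args, **kwargs)


registry = CapabilityRegistry()
registry.plug_in("fires", lambda target: f"fire mission planned for {target}")
print(registry.run("fires", "grid 1234 5678"))

# Swapping the implementation later does not change how callers use it.
registry.plug_in("fires", lambda target: f"alternate fires app tasked for {target}")
print(registry.run("fires", "grid 1234 5678"))
```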

A living document to be updated

The C2 Next characteristics are out for review and comment by industry and others across the Army.

Welch said the intent is to make C2 Next characteristics of need a living document that will be updated every six months or so.

Additionally, the Army Futures Command is in the early stages of planning a new contract vehicle to help bring these C2 Next characteristics into technology capabilities. While it’s still early, the Army may use an Other Transaction Authority type of approach as a way to bring multiple companies into the mix and experiment with different parts of the characteristics.

“I think what you’ll see is the characteristics of need, which may sound very principled and very large overarching statements, I’m expecting that they’re going to get iterated into some greater and greater levels of detail as we continue through Next Generation C2 experimentation,” Welch said. “We’re certainly moving fast and in alignment with the chief’s objective to be moving with speed and urgency. We’re going to be moving in conjunction with our partners at Acquisition, Logistics and Technology (ASA(ALT)) as we look beyond experimentation and prototyping and into delivery of Next Generation C2 capability.”

The post Agile, adaptable, modular: The future of Army C2 first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/army/2024/06/agile-adaptable-modular-the-future-of-army-c2/feed/ 0
Improving citizen experience with proper data management https://federalnewsnetwork.com/commentary/2024/05/improving-citizen-experience-with-proper-data-management/ https://federalnewsnetwork.com/commentary/2024/05/improving-citizen-experience-with-proper-data-management/#respond Fri, 31 May 2024 19:28:36 +0000 https://federalnewsnetwork.com/?p=5022866 By harnessing data-driven decision-making, agencies can significantly enhance the quality and efficiency of services provided to citizens.

The post Improving citizen experience with proper data management first appeared on Federal News Network.

]]>
The White House’s recent FY25 budget proposal emphasizes improved citizen services and quality of life and includes initiatives such as lowering the cost of childcare, increasing affordable housing, decreasing the cost of healthcare and more.  

 To accomplish these goals, the budget proposal focuses on utilizing evidence-driven policies and programs, highlighting the need for additional personnel to collect and analyze evidence, such as data, to properly inform agency initiatives.  

 Most agencies currently collect different types of data, but there is variation in the extent to which it is used to inform decision-making processes. The Office of Management and Budget published an evidence-based policymaking guide to encourage and support agencies in making more data-driven decisions. 

 While this is one of several pieces of support that the federal government has offered agencies, a critical piece is missing from the primary discussion – the role of proper data management and how it can impact citizen services and their experiences. 

Data management for CX success

As federal agencies look to leverage data to inform policies, decisions and programs, they are undervaluing data hygiene and failing to recognize the benefits of processes such as lineage tracking and testing protocols. If data management is done incorrectly, the government will fail to meet crucial citizen needs.

For example, during an analysis of the Internal Revenue Service’s legacy IT, the Government Accountability Office found the agency lacked regular evaluations of customer experiences and needs during the implementation of OMB’s cloud computing strategy. As a result, the agency has spent over a decade trying to replace the legacy Individual Master File (IMF) system, which is the authoritative data source for individual tax account data. This lack of responsiveness to CX needs is compounded by data and other challenges, significantly affecting citizen services.

To ensure employees understand the true value of data and the benefits it can provide when used correctly, it’s important for agencies to foster a culture of data literacy, or the ability to read, write and communicate data in context. This is a foundational aspect of enhancing government’s data capabilities. 

Data plays a pivotal role in the quality of services provided to citizens. Before it can be used to inform such programs, agencies must ensure their data is organized and accessible according to proper data management protocols. 

Data management is defined as a set of practices, techniques and tools for achieving consistent access to and delivery of data across the spectrum of data subject areas and types in an agency. In the federal government’s case, having access to organized data, regardless of location, provides insights that enable decision makers to act according to relevant statistics and information.
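As a simple illustration of what consistent access and delivery can look like in practice, the hedged sketch below attaches catalog metadata, including lineage, to a dataset so anyone consuming it can see where the numbers came from. The field names and dataset are invented for illustration and do not reflect any agency's actual schema.

```python
# Hypothetical catalog entry carrying lineage metadata alongside a dataset.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CatalogEntry:
    name: str
    owner: str
    location: str                                  # where the data lives (warehouse table, bucket, ...)
    last_updated: date
    lineage: list = field(default_factory=list)    # ordered record of sources and transformations

entry = CatalogEntry(
    name="benefit_usage_monthly",
    owner="Program Analytics Office",
    location="warehouse.public_benefits.monthly",
    last_updated=date(2024, 5, 1),
    lineage=["state_monthly_reports (raw)", "deduplicated", "aggregated_by_state"],
)
print(entry.name, "->", " -> ".join(entry.lineage))
```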

This level of insight helps the government meet the requirements needed to correctly inform citizen programs and bolster citizen services, especially since the process may include migrating large data sets from legacy systems.

When agencies successfully adhere to proper data hygiene and management, valuable resources for citizen use are made available, ranging from updated payment systems to public safety information such as crime rate data. Once the data has been properly stored and organized, business intelligence and analytics software tools such as ServiceNow or Tableau can help agencies make informed decisions. 

Impact on citizen services

The government provides a variety of services that citizens rely on daily, including health benefits, food assistance, social security and more. But as the economic landscape changes, the government’s citizen services must also change. 

To help individuals and businesses during the COVID-19 pandemic, Congress allotted $2.6 trillion to support vulnerable populations, public health needs and unemployment assistance. When agencies can access readily available data that has been adequately managed, it is easier to provide the services citizens need in a timely manner. Additionally, by ensuring internal data is ready for use, agencies can serve all citizens regardless of factors such as race, location or age.

Suppose the government decides to increase the amount of food assistance provided across the country and disperses an equal amount to every state without knowing population density, unemployment rates and other essential factors. In that case, they risk significantly decreasing the level of impact of such an initiative. While a simple example, this showcases the importance of data when making decisions that impact the lives of millions of individuals. 
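The back-of-the-envelope sketch below works through that example with made-up numbers: distributing the same total equally per state versus weighting it by a simple need score. The figures and weighting formula are purely illustrative.

```python
# Toy comparison of equal versus need-weighted distribution of assistance funds.
total_funding = 90_000_000  # hypothetical dollars

states = {
    # state: (population, unemployment_rate) -- made-up values
    "State A": (1_000_000, 0.03),
    "State B": (5_000_000, 0.06),
    "State C": (9_000_000, 0.09),
}

equal_share = total_funding / len(states)

# Simple need score: population times unemployment rate.
need = {s: pop * rate for s, (pop, rate) in states.items()}
total_need = sum(need.values())

for s in states:
    weighted = total_funding * need[s] / total_need
    print(f"{s}: equal ${equal_share:,.0f} vs. need-weighted ${weighted:,.0f}")
```

Under the equal split, the least-populated state receives the same amount as the largest, which is exactly the mismatch the example warns about.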

Given the focus of the White House’s FY25 budget proposal, the federal government will see an increased need for proper data management to improve citizen services. Agencies must return to the foundational aspects of data hygiene to be successful. 

By harnessing the power of data-driven decision-making, adopting innovative technologies and fostering a culture of data literacy, agencies can significantly enhance the quality and efficiency of services provided to citizens. 

This transformation not only meets the evolving needs and expectations of the public but also represents a fundamental commitment to transparency, efficiency and accountability in governance. In this digital age, effective data management is not just a strategic asset but a cornerstone of democratic engagement and public trust. 

Laura Stash is executive vice president of solutions architecture at iTech AG. 

The post Improving citizen experience with proper data management first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/commentary/2024/05/improving-citizen-experience-with-proper-data-management/feed/ 0
Using data, revolutionary Sammies finalist sees forest for the trees https://federalnewsnetwork.com/technology-main/2024/05/using-data-revolutionary-sammies-finalist-sees-forest-for-the-trees/ https://federalnewsnetwork.com/technology-main/2024/05/using-data-revolutionary-sammies-finalist-sees-forest-for-the-trees/#respond Fri, 31 May 2024 16:44:30 +0000 https://federalnewsnetwork.com/?p=5022679 Robert McGaughey revolutionized the U.S. Forest Service’s ability to visualize aerial surveillance data as useful information.

The post Using data, revolutionary Sammies finalist sees forest for the trees first appeared on Federal News Network.

]]>

With too much data, you can lose sight of the forest for the trees. The Federal Drive with Tom Temin spoke to a guest who revolutionized the ability of the U.S. Forest Service to visualize, as useful information, the mass of data from aerial surveillance. For his work, he’s a finalist in this year’s Service to America Medals program: Research Forester Robert McGaughey.

Interview Transcript:

Tom Temin With too much data, you can lose sight of the forest for the trees. Well, my next guest revolutionized the ability of the U.S. Forest Service to visualize, as useful information, the mass of data from aerial surveillance. For his work, he’s a finalist in this year’s Service to America Medals program. Research forester Robert McGaughey joins me now. Mr. McGaughey, good to have you with us.

Robert McGaughey Thank you, Tom. Nice to be here.

Tom Temin And you are a programmer of software code that did something to other software to make it usable. Tell us what you’ve done here.

Robert McGaughey So, technically, I’m trained as a forester, so it’s an interesting combination being a programmer and a forester. But the technology that I work with is called airborne lidar. That’s light detection and ranging. And the basic idea is you have this laser rangefinder in an aircraft that fires out millions of pulses per second, or a million pulses per second. It gets a measurement of every object that that pulse hits and naturally produces a lot of data. So the software that I developed reduces that data down to more usable products. The software has been around for about 20 years and been used across the country within federal agencies, universities especially, and then around the world as well.

Tom Temin Interesting. So the lidar is then surveillance that is not photography. It’s surveillance using this lidar. And what is the output of lidar in its raw form?

Robert McGaughey The true raw form are just range measurements from an aircraft. So the distance to an object that’s hit by that laser pulse, that’s combined with the position and attitude of the aircraft to get an actual XYZ point for every object that’s hit. And as I said, it’s millions of points over a small area, densities in the ten points per square meter and higher.
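For readers who want a concrete picture of that step, the sketch below is a deliberately simplified, flat-earth version of turning one range measurement into an XYZ point: combine the aircraft position with the pulse direction, here reduced to a single scan angle and ignoring the full roll, pitch, yaw and geodetic corrections a real lidar workflow applies.

```python
# Simplified (toy) projection of a slant-range lidar return into XYZ coordinates.
import math

def range_to_point(aircraft_x, aircraft_y, aircraft_alt, scan_angle_deg, slant_range):
    """Flat-earth toy model: aircraft position plus pulse direction times range."""
    angle = math.radians(scan_angle_deg)
    x = aircraft_x + slant_range * math.sin(angle)    # offset across the flight line
    y = aircraft_y                                    # along-track offset omitted for brevity
    z = aircraft_alt - slant_range * math.cos(angle)  # elevation of the object that was hit
    return x, y, z

# Aircraft at 1,500 m altitude; a 1,480 m return at a 10-degree scan angle.
print(range_to_point(0.0, 0.0, 1500.0, 10.0, 1480.0))
```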

Tom Temin So, in other words, it’s a way of representing as data a three-dimensional, potentially three-dimensional image of what you’re looking down at with the lidar equipment.

Robert McGaughey Exactly, exactly. That’s it.

Tom Temin And so the output of putting the lidar data through your program, does that result in pictures or visualizations?

Robert McGaughey Visualization is a big part of it. Just understanding what these systems measure. It’s kind of hard to wrap our brains around the complexity that happens, you know, with this laser and the distance and the precise attitude and everything, but just knowing that it hits trees, buildings, the ground especially is important. But we reduce that down to something that’s more usable, just a whole bunch of points. It’s interesting to look at, but from an information standpoint, you know, our brain processes things really well, and we can see patterns and see objects that we recognize. But getting our computer to see that is a little different.

Tom Temin So how does this work in a practical sense for the Forest Service? First of all, we’ll talk about that context that you developed it for in the first place.

Robert McGaughey So, for us, it was really a research problem to start with, to understand that these systems were even useful for forestry. We knew that they could measure the ground really well, very accurately, but we didn’t even know if we would get data from trees. So once we realized that we could get good measurements from trees, we could measure the tree height over very large areas. We can measure variability in that height. We can measure patterns of vegetation. So there are trees in some areas, not trees in others. You know, big trees, little trees. And all that information is useful for making management decisions. It starts to give us information about the size of the trees, potential value, something about the age just because of their size, where there are denser areas of trees that might need to be thinned to encourage growth, or to remove or reduce fuel for fire risk. Things where we might want to go in and plant because there’s not enough trees. So all those kind of things are useful information to have over the large areas, and it’s wall to wall over the areas covered by this data.

Tom Temin I suppose it could also tell you reasons to look into further as to why the forest density is higher here, as opposed to there also.

Robert McGaughey It could give you some indication. At least it would tell you the areas where you have conditions that maybe you want to know more about. So kind of a reconnaissance before you were to go to the field to do more work.

Tom Temin We’re speaking with Robert McGaughey. He’s a research forester with the U.S. Forest Service and a finalist in this year’s Service to America Medals program. Okay, so we understand what’s happening here from the forestry standpoint for our programming listeners. Then what did you do to convert the point data from the lidar into these visualizations or information that’s usable?

Robert McGaughey So, this software, which is called Fusion, because it fuses different sources of data to help you understand what they all measure. As I said, there is a visualization component that just lets you grab small samples of these data and spin them around and look at these point clouds. Really useful for just understanding what’s being measured. But probably most useful, there’s a whole suite of programing tools or smaller components that allow you to string together commands to take this point cloud and reduce it down to information in a raster form. So that’s like little cells of information that have been summarized from the point data. Much smaller, much easier to work with. Typically, in a geographic information system, that data becomes very useful and very digestible to other types of software. So the software that I’ve developed really takes that raw XYZ point cloud or three-dimensional point cloud, and boils that down into something that’s much more usable for a broader audience.
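As a rough illustration of that reduction, the sketch below bins made-up XYZ returns into grid cells and keeps the maximum height per cell, a crude canopy-height surface. It is not Fusion's actual algorithm, just the general point-cloud-to-raster idea.

```python
# Toy point-cloud-to-raster summary: maximum return height per grid cell.
from collections import defaultdict

points = [  # (x, y, height_above_ground) in meters -- made-up returns
    (0.4, 0.2, 18.1), (0.7, 0.9, 21.5), (1.3, 0.1, 0.2),
    (1.8, 0.6, 25.3), (0.1, 1.4, 17.0), (1.9, 1.9, 24.8),
]
cell_size = 1.0

cells = defaultdict(list)
for x, y, h in points:
    cells[(int(x // cell_size), int(y // cell_size))].append(h)

raster = {cell: max(heights) for cell, heights in cells.items()}
for cell, top in sorted(raster.items()):
    print(f"cell {cell}: max height {top:.1f} m")
```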

Tom Temin Well, how did people use lidar data before this?

Robert McGaughey I started developing software pretty much about the same time lidar came to forestry. So in the late 90s, there were some initial research efforts to understand what the technology could do. I like to joke our first project included 40 million data points and we had no software. There was nothing commercially available that really could handle 40 million points. Today, that’s collected in a blink of an eye. I mean, literally, we collect 40 million points in a few seconds.

Tom Temin Right. When the first hard drives came out for PCs in that era, you had to decide, should I get the five megabyte drive or the ten megabyte drive?

Robert McGaughey Yes. Things have come a long way. Yeah, we just keep adding zeros to the end of those numbers, and we seem to add them pretty quickly.

Tom Temin And as a forester, I mean, you were trained as a forester. That was your career choice. How did you bridge from looking at trees as a forester to looking at trees as a programmer?

Robert McGaughey So, my undergraduate work was in forestry and I did a masters in forestry, but I developed software for my master’s work as well. My original work was in timber harvesting. It was in doing the engineering design behind some of the systems used out West. It’s one of those things, you know, computer programming, take a couple classes. I had an aptitude for it and I actually really enjoyed it. I always see it as kind of the ultimate engineering thing, where you’re building something that accomplishes a task. So a lot of self-taught. The software that I developed is developed in C++.

Tom Temin Yeah, that was my question. Because when you started, I mean, people were still programming point and coordinate data using Fortran and languages in that era.

Robert McGaughey Yes, that’s true. That’s true. I actually did some Fortran work at one point, but pretty much self-taught on C and C++. Having probably advanced as far as I, you know, could have in that realm. But my software works. It’s very robust. I’ve had the good fortune of having a partnership with a group in the Forest Service that develops training materials, and so we get a lot of testing and training done through the support from that group as well.

Tom Temin Sure. And now Fusion is in the open source world. How else have people applied it that you’re aware of?

Robert McGaughey So, open source is a sticky word today because it implies certain things. It’s freely available software. It’s not really developed as an open source project where we have multiple contributors, but it’s been picked up. The software is distributed as executables, the code is available, but it’s been picked up in academia to a large extent and used to help teach students what you can do with lidar data. And I see that as one of the most valuable things. It’s just we’ve had a whole set of students go through academia that now know about lidar and how to apply it to natural resource things. It’s used internationally. If you do a survey of the literature you’d find, I would say, maybe one in four papers have used the Fusion software to do some of the point cloud processing, partly because it’s free, so people can download it and use it, and it really is designed from a forestry perspective rather than a remote sensing or engineering perspective. So, some of the products, in the way that it does things and kind of the terminology and everything are friendly to foresters. So that’s really helped it be picked up and used around the world.

Tom Temin And are you still working on it? Do you still work to perfect it?

Robert McGaughey I am still working on it, yes. There’s actually a release that’ll come out here in the next week or two that incorporates a new point cloud format for the data, probably do two to three updates a year or major updates. I haven’t really added a lot of new capability for a few years, but we continue to kind of work on how to chew through lots of data faster. These data sets, as I said, are huge. Billions of points, literally, consuming pretty large space on disk drives and that kind of thing. So there’s a real focus now on moving things into the cloud and processing that way.

Tom Temin You’ve moved from the Fortran world to the Drupal and Kubernetes world of large data sets in the cloud.

Robert McGaughey At least as far as large data sets. The software doesn’t really deal with things in the cloud. It’s really set up for desktop use, and it’s all Windows-based, partly because that’s what the Forest Service uses for its computing system. So it’s always been developed to run on the systems that we have available to us.

Tom Temin Sure. And you are a forester, ultimately. Do you still get out and hug a tree once in a while and not just a keyboard?

Robert McGaughey I do. Not as much as I would like to sometimes, but I do still get to go out. We do a lot of field work. I’m also an affiliate instructor at the University of Washington back in Seattle, and I get to work with grad students who have just super interesting ideas and projects that they’re working on. So a lot of that is where the field stuff comes in, as you know, working with small areas and working with very specific projects. A lot of the large area work is just kind of recipe-driven. You plug the data in, you chew it through the software. You get a set of layers, and then there’s some standard uses that those layers are used for.

Tom Temin Sure. So a lot of people speak for the trees, but in some sense you have enabled the trees to speak to you.

Robert McGaughey You could say that, yeah. We’ve kind of enabled an ability to capture a lot of information that would have taken people weeks, months on the ground.

Tom Temin Robert McGaughey is a research forester with the U.S. Forest Service and a finalist in this year’s Service to America Medals program. Thanks so much for joining me.

Robert McGaughey You’re welcome, Tom. This has been great.

Tom Temin And we’ll post this interview along with a link to more information at federalnewsnetwork.com/federaldrive, where you can find all of our Sammies interviews. Subscribe to the Federal Drive wherever you get your podcasts.

 

The post Using data, revolutionary Sammies finalist sees forest for the trees first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/technology-main/2024/05/using-data-revolutionary-sammies-finalist-sees-forest-for-the-trees/feed/ 0
How to manage the digital records deadline https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/ https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/#respond Thu, 30 May 2024 21:06:25 +0000 https://federalnewsnetwork.com/?p=5021262 June 30th deadline approaches, when NARA will only accept digitized documents. Agencies must deal with the largest volume known as modern textual records.

The post How to manage the digital records deadline first appeared on Federal News Network.

]]>

Ever since people first applied ink to parchment, preserving records has posed challenges. Now federal agencies face a June 30th deadline to digitize certain federal records. The National Archives and Records Administration will require agencies to submit the digitized versions, including metadata for future accessibility. Agencies are also obligated to conform to NARA standards in carrying out digitization.

Long in the making and several times delayed, the digitization requirement ultimately stems from the never-ending growth in the annual production of paper records and the resulting storage volume.

“There’s hundreds of millions of dollars being spent every year by federal agencies to create, manage and store these hardcopy records,” said Anthony Massey, strategic business developer at Canon, on Federal Insights Records Management. The digitization directive, Massey said, is designed to make archiving easier and less costly while making records themselves more accessible.

The various types of documents – maps, photographs, items deemed culturally significant, and 8.5 x 11-inch bureaucratic output – each have their own associated standards and require different technologies to achieve digitization, Massey noted. Helping inform NARA’s standards-making have been guidelines from the Federal Agencies Digital Guidelines Initiative, or FADGI.

The initiative got underway about 10 years ago “as a concept of how to begin to guide agencies into what kind of a digitization format they could then roadmap their policy and procedure to,” Massey said.

Many digitizing procedures incorporate scanning. Scanning itself has continually advanced, said Tae Chong, Canon’s manager of new business development. One development especially relates to a type of document known as a modern textual record (MTR).

An MTR typically was created electronically, perhaps with a modern word processing program or – as is often the case with older records about to leave agency possession and move to NARA – in a program whose technical format is no longer extant.

That means digitizing a paper printout using scanning. Now, Chong said, scanning technology includes “software engineering techniques to tell the text from the background and … special software image processing to essentially enhance the visibility of the text element, while erasing unwanted graphics on the background.”

A second element in state-of-the-art scanning, Chong said, encompasses optical character recognition that “can kick in to pick up the text information and pass it to a software application which will then index the document for later search and retrieval.”
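To make those two steps concrete, here is a heavily simplified sketch using Pillow and pytesseract (which requires a local Tesseract install): lift the text off the background with a fixed threshold, then run OCR so the extracted text can be handed to an index. A production MTR pipeline would use far more sophisticated image processing, and the file and index names here are hypothetical.

```python
# Hedged sketch: suppress the page background, then OCR the text for indexing.
from PIL import Image
import pytesseract

def scan_to_index_text(path, threshold=180):
    page = Image.open(path).convert("L")                          # grayscale
    cleaned = page.point(lambda p: 255 if p > threshold else 0)   # crude text/background separation
    return pytesseract.image_to_string(cleaned)                   # text to pass to a search index

# text = scan_to_index_text("scanned_record.png")   # hypothetical scanned page
# search_index.add(document_id, text)               # hypothetical indexing step
```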

He noted that agencies must also by law preserve a paper copy. But by extracting the information and indexing it, public retrieval and viewing will no longer require handling the paper itself.

“This new regulatory requirement is focusing on creating a digital replica of the paper originals,” Chong said.

Special breed

MTRs differ from cultural heritage documents. In the latter type, the entire area of the document encompasses information to preserve; for example, pieces of artwork or hand-lettered manuscripts. OCR technology won’t yield much information, and the background requires preservation along with whatever else the document exhibits.

“When NARA and the working group of FADGI began to establish classifications of imaging for digitizing these various types of records,” Massey said, “they discovered in that particular context of the printed record, there was a need to get a special type of digitization process called MTR that was simpler, less involved with much less expensive equipment that could do a very high quality image and make it transportable into an archive.”

Because the MTRs exist nearly universally as printed on standard office paper, agencies can apply high speed scanning techniques to them. Massey said agencies have produced billions of MTRs, printing them out as either temporary or permanent records.

For such documents, Massey said, NARA wants an online catalog. A researcher with a particular topic “can go to a Library of Congress online catalog and look up that document, instead of having to go in person to a particular storage site or physically go and handle that document.”

While MTR is a process or image standard and not a hardware standard, Massey said Canon has developed scanners specifically for MTR.

“The hardware must then be aligned to those scanning requirements,” he said.

For practical purposes, speed is an important requirement for MTR scanners. Massey said the faster the process occurs, the faster agencies can clear back file projects for older records. For new records, he said agencies should consider establishing an in-house capability to scan and index records as they create them.

“When records management officers look at day-forward scanning,” Massey said, “knowing that from that day forward they also have to digitize these records, they want access to equipment that can do that at a setting that is confidently MTR capable.”


The post How to manage the digital records deadline first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/feed/ 0
NARA’s looming digitization deadline for agencies means the end of paper https://federalnewsnetwork.com/federal-newscast/2024/05/naras-looming-digitization-deadline-for-agencies-means-the-end-of-paper/ https://federalnewsnetwork.com/federal-newscast/2024/05/naras-looming-digitization-deadline-for-agencies-means-the-end-of-paper/#respond Wed, 29 May 2024 15:30:29 +0000 https://federalnewsnetwork.com/?p=5019103 The National Archives and Records Administration is preparing agencies for the paper cut.

The post NARA’s looming digitization deadline for agencies means the end of paper first appeared on Federal News Network.

]]>
  • The National Archives and Records Administration is preparing agencies for a looming digitization deadline. Starting on July 1, NARA will stop accepting paper records from agencies. Now, the Archives has a new website detailing the metadata requirements for a wide variety of electronic records. The goal is to ensure agencies are formatting their electronic records correctly ahead of the July 1 deadline. NARA said metadata is a key piece of preserving and providing access to the federal government's records.
  • A new request for proposal from the Centers for Medicare and Medicaid Services (CMS) has raised alarm bells among services contractors. The RFP seeks to recompete a 10-year contract to operate call centers, now in its third year. CMS wants to force bidders to establish union contracts with labor harmony agreements that bar strikes. The contract is now held by Maximus, which has 10,000 employees operating CMS call centers. Industry sources said the contract garners a 95% positive customer satisfaction rating and has had no labor issues. Professional Services Council CEO David Berteau said the labor requirement is unprecedented in such contracts and may violate the National Labor Relations Act. Proposals are due by the end of June.
    (HHS request for proposal has industry up in arms - The Federal Drive with Tom Temin)
  • One of the DoD’s top IT acquisition executives is departing federal service after a long career. Ruth Youngs Lew, the Navy’s program executive officer for digital and enterprise services, said she is retiring at the end of this week. She has led PEO Digital for the past seven years. Before that, as part of a 30-year career, she was the CIO for the Navy’s Pacific Fleet.
    (Retirement after 31 years - Ruth Youngs Lew via LinkedIn)
  • The Office of Personnel Management is working to address a recent spike in fraudulent activity. Several hundred federal employees in OPM’s flexible spending account program, FSAFEDS, are seeing fraudulent charges on their accounts. Scammers have used employees’ personal information to create fake accounts, or make false reimbursement claims. OPM, along with the program’s vendor HealthEquity, are working to secure impacted accounts and implement additional anti-fraud controls. OPM will also reimburse in full any affected employees.
  • Veterans are giving record-high trust scores to the Department of Veterans Affairs, reaching an all-time high of more than 80%. That is based on feedback from more than 38,000 veterans who received VA services between January and March this year. VA’s current trust scores are 25% higher than when it first conducted its veteran trust survey in 2016. VA Secretary Denis McDonough said the department’s workforce is committed to delivering a high level of care to veterans across all service areas. “We strive to be an agency that fits our programs into veterans’ lives,” McDonough said.
  • The State Department’s Bureau of Intelligence and Research plans to invest more in open-source data. INR’s new open-source intelligence strategy calls for the bureau to meet the rising demand for OSINT from State Department employees across the world. “When it comes to the future of OSINT, the stakes could not be higher," Assistant Secretary of State for Intelligence and Research Brett Holmgren said in an interview. Holmgren also said INR needs to harness a growing body of open information about world events. And he thinks generative AI could help the bureau sift through all that data to deliver more unclassified intelligence assessments.
  • A massive bipartisan bill is looking to make a lot of changes at the Department of Veterans Affairs. But veteran service organizations said they are concerned the bill does not have enough support to make it through Congress. Top lawmakers from the House and Senate VA Committees are backing the Senator Elizabeth Dole 21st Century Veterans Healthcare and Benefits Improvement Act. But veteran groups said the bill failed to reach a House floor vote before Memorial Day, falling short of expectations. Among its changes, the sweeping bill would give the VA additional pay flexibilities for its workforce. It would also set new requirements for VA to resume the rollout of its new Electronic Health Record.
  • The Pentagon’s Chief Digital and Artificial Intelligence Office (CDAO) is getting four new leaders to help accelerate innovation across the department. Garrett Berntsen will join the organization as the deputy CDAO for mission analytics. Berntsen previously served as the State Department’s first deputy chief data and AI officer, where he stood up the State Department’s Center for Analytics. Eugene Kuznetsov will join the CDAO as the deputy for enterprise platforms and services. Jock Padgett will step into his role as the deputy CDAO for advanced C2 acceleration. Christopher Skaluba will be the CDAO’s executive director.
  • As Election Day approaches, agencies should make sure that any job appointments or awards they give out are free from political influence. In a recent reminder, the Office of Personnel Management details how and where agencies should pay attention to keep within the guidelines. For one, agencies need to get OPM approval before moving a political appointee to a non-political position. It is a practice commonly known as “burrowing.” Agencies also cannot hand out pay bonuses or extra time off to politically appointed senior officials until after January 20, 2025.
  • The Defense Innovation Unit is seeking a cross-domain cloud-based information technology capability to make sense of big data for biodefense purposes. The system will be focused on automating anticipatory analysis for biological and health-related issues while also providing situational awareness for all levels of command. The DIU wants this system to be enabled by artificial intelligence and machine learning. The new technical solutions will work seamlessly with various DoD systems and capabilities, including the Combined Joint All Domain Command and Control initiative. Responses are due by June 7.

The post NARA’s looming digitization deadline for agencies means the end of paper first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-newscast/2024/05/naras-looming-digitization-deadline-for-agencies-means-the-end-of-paper/feed/ 0
The Marine Corps’ plan to further breakdown data siloes https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/ https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/#respond Fri, 24 May 2024 16:44:13 +0000 https://federalnewsnetwork.com/?p=5014286 Dr. Colin Crosby, the service data officer for the Marine Corps, said the first test of the API connection tool will use “dummy” logistics data.

The post The Marine Corps’ plan to further breakdown data siloes first appeared on Federal News Network.

]]>

The Marine Corps is close to testing out a key piece to its upcoming Fighting Smart concept.

As part of its goal to create an integrated mission and data fabric, the Marines will pilot an application programming interface (API) standard to better connect and share data no matter where it resides.

“Really over the next 12 months, we hope to have the autonomous piece of this API connection implemented in our environment in what we call the common management plane that allows us to execute enterprise data governance where we can then use the capabilities rather than the native capabilities within our environment to develop those data catalogs, to tag data, to track the data from its lineage from creation all the way to sharing and destruction within our environment and outside of our environment,” said Dr. Colin Crosby, the service data officer for the Marine Corps, on Ask the CIO. “We’re working with what we call the functional area managers and their leads on the data that they own because this is all new in how we’re operating. I need them to help me execute this agenda so that we can then create that API connection.”

As in many organizations, mission areas own and manage the data, but sharing it can be difficult because of culture, technology and/or policy.

Crosby said the API connection can help overcome many of these challenges.

“Our first marker is to have a working API connection on test data. Once that happens, then we’re going to start accelerating the work that we’re doing,” he said. “We’re using logistics data so what we’re doing is using a dummy data, and we’re going to pull that data into our common management plane, and then from that CMP, we want to push that data to what we call the  online database gateway. Then, by pulling that into the OTG, we can then push it into the Azure Office 365 environment, where we can then use that data using our PowerBI capabilities within our environment.”
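In rough outline, the pull-and-push flow Crosby describes could look like the short Python sketch below. The endpoint URLs, field names and tagging scheme are illustrative assumptions for this article, not the Marine Corps’ actual interfaces; the point is a single, repeatable API pattern that moves data from a source system into the common management plane and on toward the analytics environment.

```python
import requests  # common third-party HTTP client; any equivalent would work

# Hypothetical endpoints -- illustrative only, not the Marine Corps' real systems.
SOURCE_API = "https://logistics.example.mil/api/v1/test-data"   # dummy logistics data
CMP_API = "https://cmp.example.mil/api/v1/datasets"             # common management plane
GATEWAY_API = "https://gateway.example.mil/api/v1/ingest"       # online database gateway

def pull_to_cmp(session: requests.Session) -> dict:
    """Pull dummy logistics records from the source API into the CMP."""
    records = session.get(SOURCE_API, timeout=30).json()
    # Tag data on ingest so lineage can be tracked from creation through sharing.
    payload = {"source": "logistics-test", "lineage": "pilot", "records": records}
    resp = session.post(CMP_API, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()

def push_to_gateway(session: requests.Session, dataset: dict) -> None:
    """Push the CMP dataset to the gateway, where BI tools can then consume it."""
    session.post(GATEWAY_API, json=dataset, timeout=30).raise_for_status()

if __name__ == "__main__":
    with requests.Session() as s:
        push_to_gateway(s, pull_to_cmp(s))
```

Because the same pattern repeats for each new organization, it is the kind of step Crosby’s team could wrap in automation once the pilot proves out.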

Testing the API before production

Once the API connection proves out, Crosby said the goal is to push data into the Marine Corps’ Bolt platform, which runs on the Advana Jupiter platform.

He said there is a lot of excitement from logistics and other mission areas around the Marine Corps to prove this API connection technology.

“As we get more comfortable moving forward, then we will bring on the next, what we call, coalition of the willing. As of now, we have a line because we have other organizations now that are like, ‘we want to be a part of this,’” Crosby said. “The training and education command is ready to go. So we’re excited about it because now I don’t have to work that hard to get people on board and now I have people knocking on my doors saying they are ready to go.”

Crosby added that before the API connection goes live with each new organization, his team will run similar tests using dummy data. He said building that repeatable process and bringing in some automation capabilities will help decrease the time it takes to turn on the API tools for live data.

Without these new capabilities, Crosby said, it takes weeks to pull CSV files, delaying leaders’ ability to make decisions.

“With the API, we’re going to near-real time type of pull and push, which is speeding up the decision cycle,” he said. “Then there are opportunities to expand on that by building applications that will aggregate data and then being able to look at data to check the maintenance on equipment, and then it’d be a little bit easier to understand what we need and when. The goal is to shrink that decision cycle a little bit.”

The API connection tool is one piece to the bigger Marine Corps effort to create an integrated mission and data fabric. Crosby said that initiative also relies on the unification of the Marine Corps enterprise network to bring the business side and the tactical side together into one environment.

“The fabric is a framework and approach of our environment today and how we want to connect our environment in an autonomous fashion using APIs, so that we can pull data and we can share data, regardless of the cloud environment that it’s in, regardless of whatever database structure the data resides in,” Crosby said. “It allows us to be flexible. It allows us to scale and to really push data and pull data at a speed that we’ve never done before. What I love about the fabric is it really gets to that decision making. It allows our commanders to make sense and act within real or near real time.”

The post The Marine Corps’ plan to further breakdown data siloes first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/feed/ 0
Platform One looks to enhance, build on software factory services https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/ https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/#respond Wed, 22 May 2024 02:04:14 +0000 https://federalnewsnetwork.com/?p=5010544 The Air Force’s Platform One is accelerating modern software development for the Defense Department.

The post Platform One looks to enhance, build on software factory services first appeared on Federal News Network.

]]>

The Air Force’s Platform One program has a well-established role in bringing “DevSecOps” software development to the Defense Department.

Now the program is focusing on enhancing its existing services, while expanding its secure software development work into more sensitive data environments.

Platform One’s core offerings include “Iron Bank,” a secure repository of hardened container images. Maj. Matthew Jordan, chief of product for Platform One, describes Iron Bank as the “Lego bricks” needed to build modern software. That includes hardened applications, continuous monitoring, vulnerability scanning and regular updates.

“We ensure that we’re patched within our repository, and all of our downstream consumers are able to easily receive the cybersecurity benefits,” Jordan said on Federal News Network. “You’re getting a lot of economies of scale there.”

Iron Bank is primarily accredited for less sensitive unclassified information. But Jordan said Platform One is working to get Iron Bank accredited for controlled unclassified information (CUI) as well as for classified information.

That work is detailed in Platform One’s new product roadmap, which lays out the program’s plans for various offerings and services, including “Big Bang,” its continuous integration and continuous delivery/deployment (CI/CD) platform, and the “Party Bus” platform-as-a-service.

Platform One’s zero trust approach

Platform One also provides a “cloud native access point” (CNAP) for accessing the software factory’s various services in a secure manner. Jordan said CNAP was “borne out of necessity” in the early days of Platform One, as it sought to work with software vendors, including nontraditional defense vendors, to establish its agile software development platform.

“How do you ensure that you’re still being secure and accessing things that may be coming from the Internet, or via contractor’s workplace or from their home as opposed to in a secure facility on a base?” Jordan explained. “So CNAP allows you to do the device compliance checks, so that you get a lot of attributes about the device itself, as well as understand who the user is, and get a lot of attributes on that user, and then make risk decisions as to ‘Okay, based on what we’re seeing today, you only get access to a certain subset of resources.’”

The capability allows each application owner behind the access point to “set their policies dynamically and make informed risk decisions, or accept the risk if they’re willing to, or mitigate the risk that they want to,” Jordan added.
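As a rough illustration of the kind of attribute-based decision Jordan describes, the sketch below scores a request on a handful of device and user attributes and returns the subset of resources it may reach. The attribute names and thresholds are hypothetical, not Platform One’s actual CNAP policy.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    # Hypothetical device and user attributes a CNAP-style gateway might collect.
    device_managed: bool   # endpoint enrolled in device management?
    os_patched: bool       # operating system at an approved patch level?
    user_role: str         # e.g. "developer", "auditor"
    network: str           # e.g. "base", "vpn", "internet"

def allowed_resources(req: AccessRequest) -> set:
    """Return the subset of resources this request may reach (rules are illustrative)."""
    resources = {"public-docs"}                  # lowest-risk tier for any authenticated user
    if req.device_managed and req.os_patched:
        resources.add("source-repos")            # compliant devices see more
        if req.user_role == "developer":
            resources.add("build-pipeline")      # user attributes narrow it further
        if req.network in {"base", "vpn"}:
            resources.add("cui-projects")        # CUI only from trusted networks
    return resources

# Example: a developer on a patched, managed laptop connecting over VPN.
print(allowed_resources(AccessRequest(True, True, "developer", "vpn")))
# -> a set containing 'public-docs', 'source-repos', 'build-pipeline' and 'cui-projects'
```

In practice, each application owner behind the access point would tune rules like these to match the risk it is willing to accept.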

CNAP is a key piece of Platform One’s zero trust security architecture, which also includes macro segmentation using a software-defined perimeter, Jordan said. Internally, Platform One also uses service meshes to ensure segmentation between individual applications, as well as continuous logging and monitoring.

“And that runtime security so that you can understand when something is going wrong or something’s attempting to do something that it shouldn’t, and then dive deeper for that root cause analysis,” Jordan said.

Platform One is also collaborating with other Air Force organizations on an application programming interface (API) reference architecture document. Jordan said that document is currently in draft.

“Data is king, and it’s crucial that we don’t allow data to just be put into a silo,” he said. “We need to be able to share that data. And API is definitely one way to enable that data flow. So we need to focus on providing those standards for application programming interfaces, software, development kits, data fabrics, all that kind of stuff to the developers. So they can quickly focus on developing features for their applications, as opposed to focusing on interfaces.”

The post Platform One looks to enhance, build on software factory services first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/feed/ 0
DoD’s former chief digital and AI officer heads to private sector https://federalnewsnetwork.com/federal-newscast/2024/05/dods-former-chief-digital-and-ai-officer-heads-to-private-sector/ https://federalnewsnetwork.com/federal-newscast/2024/05/dods-former-chief-digital-and-ai-officer-heads-to-private-sector/#respond Tue, 14 May 2024 13:31:55 +0000 https://federalnewsnetwork.com/?p=5000347 Craig Martell, whose outside-government gigs have been with LinkedIn, Dropbox and Lyft, is joining Cohesity as its chief technology officer.

The post DoD’s former chief digital and AI officer heads to private sector first appeared on Federal News Network.

]]>
  • DoD's former chief digital and AI officer has a new job in the private sector. Craig Martell, who served as the Defense Department's first chief digital and artificial intelligence officer for almost two years, is joining Cohesity as its chief technology officer. In that new role, Martell will seek to accelerate internal innovation around, and external advocacy for, Cohesity’s AI-powered tools and capabilities to improve the use of enterprise data. Martell came to DoD in 2022 after spending most of his career in the private sector with LinkedIn, Dropbox and Lyft. Martell left DoD in March, and Radha Plumb, the former deputy undersecretary of Defense for acquisition and sustainment, assumed the CDAO role in early April.
  • A draft version of the House defense policy bill would raise junior enlisted pay by 15%. Members of the House Armed Services Committee want to give enlisted troops ranked E-1 through E-4 a 15% raise. Last month, the committee introduced the Servicemember Quality of Life Improvement Act to address recommendations made by the House quality of life panel. The legislation is meant to serve as the foundation for all quality of life issues in the 2025 defense policy bill. The Defense Department is also in the middle of its Quadrennial Review of Military Compensation, which will impact lawmakers’ final decision. President Joe Biden’s 2025 budget proposal also includes a 4.5% raise for all service members.
  • The Postal Service is pausing some of its facility changes until next year. USPS is looking at 60 of its mail processing facilities and considering whether to move some operations to larger regional hubs. But Postmaster General Louis DeJoy said the agency will pause these plans until at least January 2025. That is because more than a quarter of the Senate recently called on USPS to pause these changes until a third-party regulator can weigh in on the merits of its network modernization plan. USPS said its reviews will not result in facility closures or career employee layoffs.
  • The Postal Service is looking to raise prices on more than just mail. USPS is asking its regulator for a 25% increase on the rates it charges for Parcel Select, a package service that caters to high-volume shippers. USPS said it does not plan to raise prices on consumer-focused package services, such as Ground Advantage, Priority Mail and Priority Mail Express. But in July, USPS is planning on raising the price of its first-class Forever stamp to 73 cents.
  • The Army Software Factory program is expanding to add a new chief learning officer (CLO). The Army Software Factory describes itself as an Army Futures Command unit that enables soldiers to become software professionals. The CLO will oversee the learning and development initiatives for the entire organization. The person will also lead the development of the software factory's organizational, programmatic, operational and policy matters pertaining to training programs, strategic initiatives and activities. Applications for this Austin, Texas-based GS-14 position, which is open to the public, are due by May 17.
  • Joint Force Headquarters-DoD Information Networks has completed "Locked Shields 2024," its largest cyber exercise. Locked Shields focuses on cyber attacks on critical infrastructure in real time. The exercise brought together 3,500 participants from 40 countries. This year’s Locked Shields tested out artificial intelligence and 5G technologies. The Defense Department plans to operationalize lessons learned during the exercise through the Rockville, Maryland-based National Cybersecurity Center of Excellence, in partnership with Marshall University and West Virginia University.
  • Data experts across the federal government are setting shared goals. The Chief Data Officers Council is calling on its members to make the federal workforce more data-savvy and make sure agencies are able to hire the data experts they need. The council is also focused on making agency data sets easier to share with top users. Chief data officers are planning to help their agencies prepare for a rise in artificial intelligence tools. The council is looking to make progress on these goals by the end of fiscal 2025.
    (About Us - CDO Council)
  • The House Armed Services Committee wants to modernize the Defense Department’s processes to grant a cyber authority to operate (ATO). If passed, the 2025 defense policy bill would require DoD to establish and regularly update a digital directory of all authorizing officials in the military departments. It would also require the military services' chief information officers to implement a policy under which authorizing officials presume a platform is secure if it has already been accredited by another military service. Lawmakers, defense officials and industry partners have long said lengthy ATO processes are slowing down software development within the DoD.
  • The Cybersecurity and Infrastructure Security Agency is adding more depth and detail to the common vulnerabilities it provides public and private sector organizations. A new effort called "Vulnrichment" will add details such as Common Platform Enumeration, Common Vulnerability Scoring System, Common Weakness Enumeration and Known Exploited Vulnerabilities data to its Common Vulnerabilities and Exposures (CVEs). So far, CISA said it has enriched more than 1,300 CVEs and will continue to add more details in the coming weeks to the other CVEs. CISA is listing all this new data on its Vulnrichment GitHub repository.

The post DoD’s former chief digital and AI officer heads to private sector first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-newscast/2024/05/dods-former-chief-digital-and-ai-officer-heads-to-private-sector/feed/ 0
Is your position classification system stifling your agency’s recruitment and hiring? https://federalnewsnetwork.com/federal-insights/2024/05/is-your-position-classification-system-stifling-your-agencys-recruitment-and-hiring/ https://federalnewsnetwork.com/federal-insights/2024/05/is-your-position-classification-system-stifling-your-agencys-recruitment-and-hiring/#respond Thu, 09 May 2024 19:18:27 +0000 https://federalnewsnetwork.com/?p=4995362 Despite progress, there is still one overlooked opportunity for workforce modernization: position classification.

The post Is your position classification system stifling your agency’s recruitment and hiring? first appeared on Federal News Network.

]]>
This content is sponsored by Monster Government Solutions.

Federal human capital has been on the Government Accountability Office’s high-risk list for two decades now, but the urgency of this issue has escalated in recent years. Despite broad progress, one often-overlooked opportunity for workforce modernization remains foundational to a federal agency’s recruitment and hiring: position classification.

Defining position duties and requirements, and classifying those positions to establish titles, series and grades, can improve pay equity and make agencies more attractive workplaces. A good position classification program can also empower an agency’s ability to hire the workforce it needs by helping to fill talent gaps with targeted recruitment and skills-based hiring. Creating accurate and appealing position description (PD) packages allows agencies to perform a job analysis that identifies skills to assess and create more targeted vacancy announcements. Having clear and accurate PD packages is a common best practice to build trust with employees, manage expectations of job duties, and enable performance reviews, which all support employee satisfaction and retention in the long run.

And position classification doesn’t just impact the workforce – it affects agencies in other ways, too. Overpaying two employees due to one misclassified position may not seem too detrimental, but it can quickly add up.

“Let’s say an agency misclassified 5% of its 2,000 Grade 12 IT specialists, who get paid $14,000 more than Grade 11. That means the agency would be overpaying by $1.4 million for just one grade-level discrepancy for this one job title,” said Jennifer Forrest, senior director of professional services for Monster Government Solutions.
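Forrest’s arithmetic is easy to rerun with an agency’s own figures; here is a minimal sketch using the numbers from her example.

```python
# Figures from the example above; swap in an agency's own numbers.
total_positions = 2_000      # Grade 12 IT specialists
misclassified_share = 0.05   # 5% classified one grade too high
pay_gap = 14_000             # annual pay difference between Grade 12 and Grade 11

annual_overpayment = total_positions * misclassified_share * pay_gap
print(f"Annual overpayment: ${annual_overpayment:,.0f}")  # Annual overpayment: $1,400,000
```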

Downstream, misclassification can have compounding impacts on incentives, bonuses and career mobility, which all impact an agency’s ability to retain its employees.

Collaborative, organized and secure classification

Position classification has always demanded attentive and effective engagement between classifiers, hiring managers and other HR stakeholders, but with today’s hybrid and under-resourced workforce, collaborating quickly, easily, remotely and securely is even more challenging.

“Right now, the main tools many agencies use are emails and shared drives. While these can be used remotely and have some search capabilities, they are extremely limited in their ability to contribute to a transparent, collaborative, and secure classification experience,” said Forrest.

A modern, flexible and centralized system can facilitate more effective data-sharing and collaboration, streamline and guide classification activities, and track requests and handoffs between team members. Most importantly, agencies have the assurances of the most advanced federal security regulations, as commercial, cloud-based solutions for the federal government are now required to have FedRAMP authorization.

Data-powered classification, compliance and auditability

Agencies can leverage invaluable proprietary data from their own existing classification materials, but are they taking advantage?

“Instead of managing 40,000 individual PD files, agencies could search through a virtual library of 4,000, or even 400, standard PDs to repurpose the most relevant one,” said Forrest.

In addition to storing documents, modern classification tools, like intelligent PD builders, can rely on an agency’s PD library and its data to guide users in rapidly and consistently creating accurate and compliant quality documents.
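As a simple illustration of the idea, not a description of any vendor’s product, the sketch below ranks standard PDs in a small, made-up library by similarity to a draft description; this is the kind of lookup an intelligent PD builder might automate with far richer data.

```python
def tokens(text: str) -> set:
    """Lowercase word tokens with punctuation stripped -- crude but dependency-free."""
    return {w.strip(".,;:()").lower() for w in text.split() if w.strip(".,;:()")}

def jaccard(a: set, b: set) -> float:
    """Similarity between two token sets (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def best_matches(draft: str, library: dict, top_n: int = 3):
    """Rank standard PDs in a library by similarity to a draft description."""
    draft_tokens = tokens(draft)
    scored = [(jaccard(draft_tokens, tokens(text)), name) for name, text in library.items()]
    return sorted(scored, reverse=True)[:top_n]

# Tiny, made-up library of standard PDs, purely for illustration.
library = {
    "GS-2210-12 IT Specialist (INFOSEC)": "Plans and implements information security controls and monitoring",
    "GS-2210-11 IT Specialist (SYSADMIN)": "Administers servers, networks and user accounts",
    "GS-0343-12 Management Analyst": "Analyzes programs, processes and organizational data",
}
print(best_matches("Implements security controls and monitors agency networks", library))
```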

Good classification technology is embedded with advanced data collection and reporting capabilities for data-driven insights and easier audits for compliance with federal regulators, including the Office of Personnel Management.

“Everything within the classification system should be tracked so that it’s easy to go back and understand decisions that were made,” Forrest said. “Enforcing consistency, compliance and equity requires auditability of who’s made what changes, who’s approved to sign off, and which additional team members have accountability for the finalization.”

Invest in planning for a great modernization

Modernization efforts can seem daunting, especially if an agency is still dealing with paper files. Every agency has its own talent needs, security requirements, policies, processes and resources that need to be considered during a modernization. However, this groundwork of gathering requirements will empower agencies to implement the classification program that works best for them and provides the most value for their investments.

“One of the biggest hurdles in modernizing a classification system is getting the data into the new system itself. Agencies must consider data migrations up front in their planning. There’s lots of existing data, and not all of it is good data,” Forrest said. “What many are wondering is how artificial intelligence can help.”

AI, especially generative AI, is garnering a lot of attention within federal hiring, but it requires meticulous sourcing and can bring its own risks, like implicit bias. One opportunity is to use AI for activities that are less risky for government agencies – for example, data migrations.

“MonsterGov is currently working with federal agencies to use a new AI-powered tool to accelerate PD migration in support of their Monster Position Classification implementations,” Forrest said. “An AI-enabled migration cuts down on time, streamlines the data, minimizes errors, and enables the agency to securely store quality PD data in their FedRAMP-secure environment.”

Behind the scenes, machine learning helps improve the migration process itself, so agencies not only get a jumpstart on their data migration but can also expect to optimize their classification program for even better results over time.

Position classification isn’t just something to check off as “done” to ensure regulators and agency leadership are satisfied. While it can seem complex, it provides significant benefits that empower agencies to attract the right job seekers, effectively fill talent gaps, and retain the employees they worked so diligently to capture. A modern and effective classification system is a powerful avenue to building a fairly rewarded and equitable workforce, an increasingly important factor in hiring the talent agencies need to reach mission success.

Monster Government Solutions is the leading commercial innovator of federal recruitment and talent acquisition solutions that empower federal agencies to find, hire, and onboard high-quality talent. For more information visit monstergov.com.

The post Is your position classification system stifling your agency’s recruitment and hiring? first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/is-your-position-classification-system-stifling-your-agencys-recruitment-and-hiring/feed/ 0
Industry Exchange Data 2024: TVAR’s Sam O’Daniel, Dell EMC’s Ed Krejcik on managing boom in AI-driven storage demand https://federalnewsnetwork.com/big-data/2024/05/industry-exchange-data-2024-tvars-sam-odaniel-dell-emcs-ed-krejcik-on-managing-boom-in-ai-driven-storage-demand/ https://federalnewsnetwork.com/big-data/2024/05/industry-exchange-data-2024-tvars-sam-odaniel-dell-emcs-ed-krejcik-on-managing-boom-in-ai-driven-storage-demand/#respond Wed, 08 May 2024 15:06:55 +0000 https://federalnewsnetwork.com/?p=4993268 Spiraling growth of unstructured data and its use in training large language models for AI requires a well-crafted hybrid cloud storage architecture.

The post Industry Exchange Data 2024: TVAR’s Sam O’Daniel, Dell EMC’s Ed Krejcik on managing boom in AI-driven storage demand first appeared on Federal News Network.

]]>

As the importance of data grows in the artificial intelligence era, organizations must pay attention to the growth in unstructured data, not just data that fits in rows and columns.

“We’re seeing unprecedented growth in unstructured data,” said Ed Krejcik, senior manager of unstructured data solutions engineering for federal at Dell EMC.

“Just over the last couple of years, it’s grown from being 80% of all data created to now being 90%,” Krejcik added during Federal News Network’s Industry Exchange Data 2024.

He said unstructured data comes in many forms, including satellite and medical imagery, video streams, weather models and data produced by Internet of Things sensors.

As IT organizations seek to establish greater edge computing capability, it’s important to understand that a great deal of unstructured data originates at the tactical edge, said Sam O’Daniel, president and CEO of TVAR Solutions.

In the Defense domain, sensors and end users both produce unstructured data with tactical importance, he said.

“On the scientific research side — from health care research, atmospheric research, space research, environmental research — all of the data is coming from these different sensors and components as part of this unstructured data growth,” O’Daniel said. “And it’s continuing to scale.”

Better storage helps in use of unstructured data

Storage and storage management become the first order of business in deriving value from unstructured data, Krejcik and O’Daniel said.

“Because we’re seeing such exponential growth in unstructured data, we recommend a scale-out type of approach to storing the data because it’s a lot more nimble. It’s very easy to manage, and it scales easily,” Krejcik said.

By scale-out, as opposed to scale-up, he meant adding storage resources as needed rather than extending the architecture to more computing servers.

“We can just add in capacity and performance as needed,” Krejcik said. “We can just keep building on that storage infrastructure without any downtime, without any loss of access to the user data.”

He advised agencies to optimize the balance between on-premises and cloud data storage. The federal government’s cloud-smart strategy “forced everyone to evaluate the value of the data, the performance requirements, the latency requirements.”

In a typical scenario, he said an application would access critical data stored locally, in the data center, to experience the least latency. This approach also minimizes data egress costs of cloud storage.
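A toy placement heuristic makes the trade-off concrete. The unit costs below are placeholders, not any vendor’s actual pricing; the logic simply keeps latency-sensitive data local and otherwise compares storage plus egress costs.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    latency_sensitive: bool   # does the application need local, low-latency reads?
    monthly_reads_gb: float   # how much is read back per month?
    size_gb: float

# Hypothetical unit costs per GB -- placeholders, not actual pricing.
CLOUD_EGRESS_PER_GB = 0.09
CLOUD_STORE_PER_GB = 0.023
LOCAL_STORE_PER_GB = 0.05

def placement(ds: Dataset) -> str:
    """Pick a tier: keep hot, latency-sensitive data local; weigh costs for the rest."""
    if ds.latency_sensitive:
        return "local"                            # least latency, no egress fees
    cloud_cost = ds.size_gb * CLOUD_STORE_PER_GB + ds.monthly_reads_gb * CLOUD_EGRESS_PER_GB
    local_cost = ds.size_gb * LOCAL_STORE_PER_GB
    return "cloud" if cloud_cost <= local_cost else "local"

print(placement(Dataset("weather-model-output", False, monthly_reads_gb=50, size_gb=2_000)))
# -> "cloud" for this rarely read archive; a latency-sensitive dataset would return "local"
```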

O’Daniel added, “The ability to create a true hybrid cloud strategy is very important when it comes to data, just being able to ensure that the users have that direct access locally.” He said this is especially true in research, where rapid access to data often is necessary.

Under the earlier cloud-first strategy, agency IT staffs “quickly realized that it was not cost effective, essentially blowing through budgets that were supposed to last a few years in a few months,” O’Daniel said.

How to manage AI training, integrate historical data

For the application of data to AI training, “a big part of large language models really comes down to the analysis and inferencing of historical data,” O’Daniel said. Historical data “is going to only create better information coming from those large language models.”

Such training doesn’t happen by serendipity, though. Krejcik said it requires an iterative process that starts with data scientists acquiring the data. Training data itself needs preparation and staging on an infrastructure designed to support training, validation and deployment. A well-crafted hybrid storage strategy, he said, will ensure efficient training ingest of perhaps billions of data points.

“And finally, retention,” Krejcik said. “Each one of those steps or pillars of the process require different types of performance and capacity.” This is where the scale-out architecture with specific tiers of storage works well, he said.

Because data likely comes from a variety of in-house and external sources, “all that data needs to be packaged up in such a way that it can be put into that large language model.”
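One way to picture that packaging step is a small script that normalizes records from differently shaped sources into a single JSON Lines file staged for training ingest. The field names and sources below are made up for illustration, not drawn from any specific pipeline.

```python
import json
from pathlib import Path

def normalize(record: dict, source: str) -> dict:
    """Map a source-specific record onto one common training schema (fields are illustrative)."""
    return {
        "source": source,
        "id": str(record.get("id") or record.get("uuid") or ""),
        "text": record.get("text") or record.get("body") or "",
        "created": record.get("created") or record.get("timestamp"),
    }

def package(sources: dict, out_path: Path) -> int:
    """Write all sources to one JSON Lines file staged for training ingest."""
    count = 0
    with out_path.open("w", encoding="utf-8") as f:
        for name, records in sources.items():
            for rec in records:
                f.write(json.dumps(normalize(rec, name)) + "\n")
                count += 1
    return count

# Two made-up in-house feeds with different field names.
sources = {
    "maintenance-logs": [{"id": 1, "text": "Replaced hydraulic pump", "created": "2024-05-01"}],
    "sensor-feed": [{"uuid": "a9", "body": "vibration anomaly detected", "timestamp": "2024-05-02"}],
}
print(package(sources, Path("training_corpus.jsonl")), "records packaged")
```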

Discover more tips and advice shared during Industry Exchange Data 2024 now.

The post Industry Exchange Data 2024: TVAR’s Sam O’Daniel, Dell EMC’s Ed Krejcik on managing boom in AI-driven storage demand first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/big-data/2024/05/industry-exchange-data-2024-tvars-sam-odaniel-dell-emcs-ed-krejcik-on-managing-boom-in-ai-driven-storage-demand/feed/ 0
Industry Exchange Data 2024: Commvault, AWS and Kelyn experts on tuning up CX through smart data management https://federalnewsnetwork.com/federal-insights/2024/05/industry-exchange-data-2024-commvault-aws-and-kelyn-experts-on-tuning-up-cx-through-smart-data-management/ https://federalnewsnetwork.com/federal-insights/2024/05/industry-exchange-data-2024-commvault-aws-and-kelyn-experts-on-tuning-up-cx-through-smart-data-management/#respond Mon, 06 May 2024 11:58:26 +0000 https://federalnewsnetwork.com/?p=4989823 Modern applications that bring convenience to users require modernizing infrastructure to optimize data management, resilience and elasticity, panelists say.

The post Industry Exchange Data 2024: Commvault, AWS and Kelyn experts on tuning up CX through smart data management first appeared on Federal News Network.

]]>

Improving citizen or customer experience requires significant modernization of the systems that support digital services. If agencies consider convenience to the customer as a measure of better CX, then convenience should also improve for the agency’s own business owners and IT staff.

That’s according to Richard Breakiron, senior director for strategic initiatives for the Americas public sector at Commvault.

“Convenience isn’t only for the end customer. Behind the scenes, there are a lot of things that have been done to optimize the operation of a capability,” he said during Federal News Network’s Industry Exchange Data 2024.

For example, operations such as maneuvering workloads to, from and among clouds, and orchestrating cloud and on-premises data center operations, rank among the candidates for automation, he said.

Added Kevin Cronin, co-founder and president of Kelyn Technologies: “Really, what you need to do is step back, take time, gather your thoughts and put together a really good plan on how you can utilize the technologies that are in the cloud that present you opportunities to make things more efficient.”

That transformation of the backend establishes the foundation for improved CX.

“When you make things more efficient, you end up with a happier set of customers, and the convenience factor goes through the roof,” Cronin said.

Do an outside-in analysis of your digital experience

In any CX-focused, data-centric modernization, it’s wise to view applications from the outside in — that is, from the point of view of the ultimate user, said David Rubal, head of U.S. federal business development for AWS storage, part of Amazon Web Services. Then the agency can more effectively design an infrastructure to support the application needs.

“Especially around government, you have to work backward from the capability of the requirements to actually ensure that what’s being developed aligns,” Rubal said. “That applies to the full stack — everything from the network infrastructure up through the application and then the user experience.”

Modernization in service of CX “really comes down to getting to the point where operational consistency is an objective,” Rubal added.

The objective of operational consistency must apply throughout an agency’s infrastructure, “however the technology is instantiated,” he said. Whether in the data center, in a commercial cloud, in an edge computing facility with limited bandwidth, “you want that same operational consistency so that when you create processes, it’s the same experience from edge to cloud, and cloud to edge.”

Cronin said Kelyn takes Commvault capabilities in data management, backup and resiliency and integrates them with AWS storage.

“We created a process for making sure that the outcome is the same across the board, whether it’s an on-prem customer, an AWS customer or a hybrid customer,” he said.

Along with operational efficiency, modernization must include an updated strategy for data, Breakiron said. “Data is that lifeblood today,” he said.

Securing it becomes especially important because applications, in contemporary digital services, expose data to external users.

“More than ever, the federal government is looking to try and interface with its citizenry and make itself transparent,” Breakiron said. Agencies “want to expose the data. And yet they need to make sure that the data is secure, that it’s not corrupted, that it doesn’t put agencies at risk when they let a public portal be open.”

Make data plan central part of your modernization strategy

The modernization strategy must encompass data storage, access and backup, Cronin said. “You need a really good plan to make sure, if you’re going to have a backup in a different location or a recovery point, that all of the data is there,” he said. “You need to test it. You need to make sure that the architecture that you’ve designed works.”

Breakiron emphasized the need not to underestimate complexity at the back end so that users have a seamless and intuitive experience at the front end of an application.

In the goal of easy self-service, “it’s got to be very user-friendly, it’s got to be very, very intuitive,” Breakiron said. “And the complexity to do that behind the scenes goes up exponentially the easier you want to make that interface.”

He added that a classic engineering premise still applies to managing that complexity.

“You will only be able to manage the complexity if you break down the storage plane, from the data management plane, from the control plane,” Breakiron said. “By separating those out, you get enormous convenience and greater security because now you can build security in at appropriate levels based on who the user is.”

In short, modernizing should incorporate the notions of elasticity, flexibility and automation, Rubal said. “That allows you to achieve the mission capabilities and mission faster in a more operationally consistent way.”

Discover more tips and advice shared during Industry Exchange Data 2024 now.

The post Industry Exchange Data 2024: Commvault, AWS and Kelyn experts on tuning up CX through smart data management first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/industry-exchange-data-2024-commvault-aws-and-kelyn-experts-on-tuning-up-cx-through-smart-data-management/feed/ 0