How the Army is always testing, training on zero trust
June 13, 2024

The Army I Corps used the recent Yama Sakura 85 exercise to further prove out how to create a single, secure network to share information with allied partners.



The Army tackled one of its toughest challenges: creating a common operating picture for all of its allied partners.

The recent Yama Sakura 85 exercise demonstrated how the Army, the Australians and the Japanese could securely share information by using an architecture based on zero trust principles.

Col. Rett Burroughs, the chief information officer and G6 for the Army’s I Corps, said over the course of the 10-to-12-day training event last December, the Army successfully brought its allied leaders onto a single, secure network at the tactical edge.

Col. Rett Burroughs is the chief information officer and G6 for the Army’s I Corps.

“What we are looking at is properly being distributed across the entirety of the Pacific. We could have a command and control node anywhere in Australia, Thailand, Philippines, Japan, Korea, Hawaii, Guam or Alaska, and back here at Joint Base Lewis-McChord, Washington, so that now every node has roles and responsibilities. How do we ensure that connectivity happens across all of those different nodes that are very disparate and spread out? And then how do we leverage the technology of transport to ensure that we’re getting applications all the way to the edge?” Burroughs said on Ask the CIO. “We spent months preparing to ensure we had the right safeguards in place. In its simplest form, in the application for the warfighter, which is definitely my area of concern, it brought the Australians and the Japanese together because before it was the Australians and the Americans, and then it was the Americans and the Japanese. The Australians couldn’t be in the same Tactical Operations Center as the Japanese. Now we have the ability for the 1st Australian Division commander to talk directly with senior generals from the Japanese Ground Force Command.”

Burroughs said in previous exercises, the Americans and Australians would talk, and then the Americans and Japanese would talk, with the Army acting as the “go-between” for the Australians and Japanese. And Burroughs readily admits everyone knows what happens when you play the game of telephone.

“Our goal here was to establish one common operating picture and the ability to voice and video chat, and share specific information,” he said. “The application of this proved critical in the ability for staff to make informed recommendations, and for commanders to make informed decisions. We weren’t just slinging all this data just because commanders need and want everything.”

Broader application than just the Army

The success of the Yama Sakura 85 exercise proved the shared network and zero trust concept for more than just the Army: any federal organization can take the same basic concepts and create a common operating picture.

John Sahlin, the vice president of cyber solutions for General Dynamics IT, which supported the Army with integration expertise, said these same approaches could help agencies such as FEMA, which has to create shared networks to help cities or states recover from disasters.

“I’ve been fascinated by this problem set ever since I deployed for the Hurricane Katrina relief efforts back about 15 years ago. We started thinking about a military mission for that humanitarian assistance effort and it turned very quickly into an interagency and even local government support mission,” Sahlin said. “We had good communications. We had a good sight picture. We had good mapping data, which nobody else in the area did. We had to quickly share that data with first responders, the local hospital, the parish sheriff, non-government organizations like the Red Cross. I think these lessons of zero trust at the tactical edge, information sharing to inform the on-scene commander, can be learned not only by the military at the tactical edge, but by any organization that has field-deployed, forward-deployed organizations that need to share data to execute a mission rapidly and make those changes dynamically with first responders, with interagency support, things like that.”

Burroughs added this approach of creating a distributed network supported by zero trust tools isn’t just important for the tactical edge, but also for Army commanders in garrison or commands who have to coordinate with the National Guard, local first responder communities or anyone outside of the service.

“Now we don’t have to have these disparate networks that do not talk to each other because of classification and policy, which you clearly went through during the Katrina catastrophe,” he said. “Now what we’re doing is taking the need to figure this out on the fly during a catastrophe out of the equation. We’re actually getting ahead of it now by addressing it before the next catastrophe. So when something does come in competition or crisis, we’re actually able to deal with it in a methodical way instead of reacting.”

Shift toward data-centricity

In many ways, what Burroughs and Sahlin are describing is how the Army, and really every agency, must become more of a data-centric organization.

Lt. Col. Roberto Nunez, the chief of signal services support for Army I Corps, said the implementation of zero trust capabilities forces the end users to shift that data culture because they have to tag and label information much more specifically and consistently.

“You can say, ‘All right, here’s all my data that I want to share, all my users that are also tagged and labeled, as well as what they’re authorized to use and what they cannot use.’ Therefore, you can plug in with other mission partners to share that information and you can create that common environment moving forward, whether it’s joint or coalition, at least from a DoD point of view,” he said. “If you want third parties to join in, whether it’s corporate America, academics, other organizations or other government agencies, you can do that if everything’s data-centric, labeled and tagged accordingly. This is what is great about zero trust.”
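In software terms, what Nunez describes resembles attribute-based access control: both users and data carry labels, and a deny-by-default policy compares the two before anything is shared. The sketch below illustrates that idea in Python; the levels, tag names and policy are illustrative assumptions, not the Army’s actual schema.

```python
# Minimal sketch of tag-and-label access control as Nunez describes it:
# both users and data carry labels, and sharing is denied unless the
# user's labels satisfy the data's labels. Level and tag names are
# illustrative assumptions, not the Army's actual schema.
from dataclasses import dataclass, field

LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET"]  # lowest to highest

@dataclass(frozen=True)
class Resource:
    name: str
    classification: str                       # one of LEVELS
    tags: frozenset = field(default_factory=frozenset)

@dataclass(frozen=True)
class User:
    name: str
    clearance: str                            # one of LEVELS
    tags: frozenset = field(default_factory=frozenset)

def can_access(user: User, resource: Resource) -> bool:
    """Deny by default: the clearance must dominate the classification
    and the user must carry every tag the data requires."""
    if LEVELS.index(user.clearance) < LEVELS.index(resource.classification):
        return False
    return resource.tags <= user.tags

ops_plan = Resource("ops_plan", "SECRET", frozenset({"REL-AUS", "REL-JPN"}))
aus_officer = User("aus_officer", "SECRET", frozenset({"REL-AUS", "REL-JPN"}))
print(can_access(aus_officer, ops_plan))      # True: labels satisfy the policy
```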

Burroughs said planning for the next exercise, Yama Sakura 87, in December is already underway. But he said these capabilities aren’t turned on during the exercise and then turned off. The network is always on, and therefore the Army is always iterating on how to make secure information sharing better, faster and easier.

Chief Warrant Officer 4 Phil Dieppa, a senior services engineer for Army I Corps, said the Yama Sakura 85 exercise and other demonstrations have shown the service that the “come as you are” model works because of the zero trust capabilities.

“The great thing about zero trust is that we don’t trust anything until we explicitly have that conversation and say that ‘I trust you.’ Once we do that, then we can start communicating and making those services available one at a time,” he said.
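Dieppa’s “explicitly have that conversation” model is, in code terms, a default-deny allow list: a peer gets nothing until trust is granted, and services are then exposed one at a time. Here is a minimal sketch of that pattern; the class and peer names are hypothetical.

```python
# Default-deny sketch of the "explicitly say 'I trust you'" model Dieppa
# describes: a peer gets no services until trust is established, and
# services are then enabled one at a time. Names are hypothetical.
class ZeroTrustBroker:
    def __init__(self) -> None:
        self._allowed: dict[str, set[str]] = {}      # peer -> permitted services

    def establish_trust(self, peer: str) -> None:
        # The explicit "I trust you" conversation; until this, deny everything.
        self._allowed.setdefault(peer, set())

    def enable_service(self, peer: str, service: str) -> None:
        if peer not in self._allowed:
            raise PermissionError(f"{peer} has not been explicitly trusted")
        self._allowed[peer].add(service)             # one service at a time

    def request(self, peer: str, service: str) -> str:
        if service in self._allowed.get(peer, set()):
            return f"{service}: ok"
        return f"{service}: denied"

broker = ZeroTrustBroker()
print(broker.request("partner-node", "chat"))        # denied (default deny)
broker.establish_trust("partner-node")
broker.enable_service("partner-node", "chat")
print(broker.request("partner-node", "chat"))        # chat: ok
```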


Grants procurement pilots demonstrate speed to modernization
June 10, 2024

Andrea Sampanis, the acting director of the Grants QSMO in HHS, said her team helped three small agencies adopt award management systems more easily.


The Grants Quality Service Management Office over the last year helped several micro agencies buy award management services.

This pilot was part of how the QSMO is crawling before it tries to walk or run with larger agencies.

Andrea Sampanis, the acting director of the Grants Quality Service Management Office in the Department of Health and Human Services, said the procurement pilots with AmeriCorps, the Inter-American Foundation and the Northern Border Regional Commission opened the door to bigger possibilities to modernize federal grant services.

Andrea Sampanis is the acting director of the Grants Quality Service Management Office (QSMO) in HHS.

“We worked with them to explore the vendors on our Catalog of Market Research, making sure they were ready to meet their needs and helping to support them through the procurement process,” Sampanis said on Ask the CIO. “IAF and NBRC are live, on target and on budget, which is not an easy thing to do. AmeriCorps is expected to go live this fall. Huge kudos to these three agencies, as they were prepared to be good customers, willing to accept the system as-is and supported by great leaders in their chief information officer and chief procurement offices.  Their grants teams came together to support a great vendor product from our Catalog of Market Research.”

While AmeriCorps, the Inter-American Foundation and the Northern Border Regional Commission are considered micro agencies, the amount of money each of them awards through grants is anything but small. Sampanis said AmeriCorps is more like a medium-sized agency when looking at the amount of money it awards through grants. In fiscal 2024, for example, the agency expects to award $577 million in grants.

The Inter-American Foundation and NBRC are much smaller, with IAF awarding about $145 million and NBRC about $50 million in grants, respectively.

Grants QSMO aims to speed acquisition

While these three agencies don’t reach the billions HHS, the Education Department or NASA hand out, Sampanis said demonstrating how the procurement assistance pilot works opens the door to improving and expanding the QSMO’s efforts.

The QSMO marketplace currently has seven approved grants management system providers, and the office is in the middle of conducting market research to expand its services.

“We have one quote that says having access to Grants QSMO market research puts you 1,000 steps ahead in your procurement. It’s our goal to speed up the acquisition process and give agencies more buying confidence as they are pursuing a vendor on our catalog. The vendors on our catalog are selected to support meeting grants standards and align to 2 CFR 200 requirements,” Sampanis said. “It just lets them really focus their attention on a fewer number of providers to really say, ‘Hey, this solution is purpose built for grants. It’s an award management solution that is software-as-a-service and very configurable.’ It should feel easy. They don’t have to go and renegotiate a contract.”

The QSMO also works with each agency’s CIO and security leadership, helps develop performance work statements and serves as an advisor during the entire acquisition phase.

“I always encourage agencies to meet with all the vendors on our Catalog of Market Research to understand what’s out there and share their specific needs. I think they learn a lot about themselves by talking to the vendors,” Sampanis said. “I helped them all the way through the pilot because I’m learning a lot. Every time I hear a contracting officer ask a new question, I think, ‘hey, that’s something I need in my catalog because that’s true.’ I always say our goal is to speed up an agency’s acquisition and give them buying confidence.”

HHS has led the Grants QSMO since January 2021 and has been building its services over the last few years.

Earlier this year, the Office of Management and Budget finalized the update to the governmentwide grants guidance under 2 CFR, standardizing key areas like notices of funding opportunities and aiming to expand access to the more than $1.2 trillion in grants and cooperative agreements agencies pay out each year. With that in place, Sampanis said the QSMO is ready to expand its services and offerings.

Two common drivers of grants modernization

Having that baseline understanding and confidence in the marketplace is a key factor in success, said Wagish Bhartiya, the chief growth officer for REI Systems, which helps agencies modernize their grant systems.

Bhartiya said there are two basic drivers of grant modernization. The first is budget and the second is technology.

“There has been a greater focus on budget: How much of our budget goes toward grant funding, and how is that funding being deployed? How much of that is serving management processes, some of the overhead aspects of grant management, which will exist inherently, versus how much should be deployed into the community? That analysis, I think, is getting more acute,” he said. “The technology itself has evolved and shifted in a way that, I think, makes it much more possible now to be thoughtful about performance and mission. The technology is enabling some of these questions to be asked because we now have the potential and the power to look at it for the first time.”

These two big trends are part of how grants providers are shifting their mindsets away from being so compliance focused to spending more time and money on measuring and ensuring outcomes.

“There’s all these dollars flowing through our grant programs so we need to start to think just as much about the downside, protecting from a compliance and a risk mitigation perspective, as the upside into the mission impact in terms of what are the tangible and successful outcomes,” Bhartiya said. “The other big theme is customer experience and user experience, and now the grantee experience.”

He said this updated point of view is part of why many grant providers are more willing to change today than ever before. He said this means the singularity of the way grants management worked over the last few decades is going away.

“Every grant program thinks they’re a snowflake and they think they’re special or unique and actually bespoke. But when you zoom out, you see that actually 85% of what a grant making agency does is essentially the same in the core lifecycle design,” he said. “Convincing them that they don’t need to make everything bespoke and tailored to the Nth degree because they can leverage best practices, use what’s worked for other agencies because there’s a chance to reduce the burden on their staff and on the recipient community is part of the challenge.”

Bhartiya added that the benefits of an end-to-end system that’s in the cloud are becoming clearer to agencies.

How the pandemic changed IRS technology for good
May 29, 2024

Former IRS CIO Nancy Sieger, who will retire on June 1 after more than 40 years in government, said she found success during the pandemic by managing its risks.


Through the pandemic, the IRS learned it can move with urgency. And now that the emergency has subsided, Nancy Sieger, the former IRS chief information officer, believes that lesson isn’t going to waste.

Nancy Sieger is retiring from federal service after serving as the IRS CIO and Treasury Department’s CTO.

Sieger, who will retire on June 1 after more than 40 years of federal service, including the last one as the Treasury chief technology officer, said IRS is building on the IT modernization lessons learned over the past few years.

“I think technologists saved the day during the pandemic. As the IRS CIO, I had the opportunity to lead IRS efforts to ensure that services to the public were handled in the most efficient way possible. If you think back to that time, businesses shut down, cities were practically shut down, and our economy was suffering and human beings were suffering. IRS focused really hard to issue three rounds of Economic Impact Payments. I am most proud of how IRS leadership and employees rallied to get money to the people in this country who needed it the most,” Sieger said during an “exit” interview on Ask the CIO. “We had a principle that any new technology would be built in a modernized way. We were really good at relying on the older systems and delivering fast. One of the opportunities we had with the Economic Impact Payments was looking to the future, feeling like IRS might be called upon again to do something similar. We had to challenge ourselves to say it may be easy and fast to build upon old operations, but how do we do this in a modernized way so that it’s repeatable? There were three rounds of payments; each round of payments came faster and faster, culminating within 24 hours. The Economic Impact Payments and that processing were built using new tools, new testing methods, new quality assurance processes and built in a modernized way. If IRS has to do that again, the strong foundation will be there.”

Sieger said it took constant reminders to build the confidence of the developers and engineers to the point where she and then-IRS Deputy CIO Kaschit Pandya, who is now the agency’s CTO, met daily with the technology workers who were writing code and analyzing it.

“We often had to say to our folks, ‘No, no, you have my permission to do it this way. Not [the old] way.’ It was risky. We managed those risks,” she said. “But ultimately, it resulted in little-to-no rework. I would say to you, on behalf of Kaschit and myself, the hours we spent with a team doing this the way it needed to be done were very fulfilling.”

IRS can accept, manage risks

That experience has helped the IRS continue to launch modern services, such as the direct file application, launched in March across 12 states. The IRS said the direct file pilot helped more than 140,000 citizens file their taxes online and for free.

There are plenty more opportunities for the technology development lessons learned from the pandemic to continue to spread across the IRS. Commissioner Danny Werfel told lawmakers in April that the tax agency needs $104 billion for a multi-year modernization effort.

Sieger said the experience over the last three-plus years taught the IRS it can accept and manage risks differently than before.

“We took a lot of risks. We weighed those risks. We said, ‘the worst thing that could happen is this. What are we going to do when that happens?’” she said. “I think our greatest opportunity is not forgetting how we did that, and bringing that forward into future operations. I’m trying not to say don’t be risk averse, but I’m going to say it. Don’t be risk averse and accept measured risk; know what could happen, know how you’ll adapt, but let’s face it, in our personal lives, especially in the technology space, how many of us get an update on our smartphone that didn’t work. But we know the next day it will be updated and fixed. Now I am not suggesting something so aggressive in government. But I am suggesting that we look back to how the government served this country during the pandemic and bring some of those skills and learnings forward to be even more effective and efficient in government service.”

One of the biggest reasons for the IRS’ success, beyond the urgency of the moment, was the top cover that leaders gave the developers. Sieger said helping employees reduce the fear of failure, and ensuring they know they won’t be left behind should something go wrong, was a huge factor in the agency’s success.

“At the time, it was Commissioner Charles Rettig who was constantly keeping his hand on the pulse of the employees, working with Treasury to ensure that we were delivering the payments and processing tax returns and the IT workforce knew they had support. They were constantly asked, ‘What do you need?’ Sometimes they would tell us what they needed. Sometimes, I saw what they needed, and they wouldn’t ask. There was a particular weekend where the team was working really hard,” she said. “This was not a case of the workforce being hesitant to do new things. This was a case of the workforce having the skills they needed to do this in the most elegant way, and once leadership let them know — from Commissioner Rettig through the different deputy commissioners to myself and all the front line executives at the IRS who helped them — they were able to get things done and help the country. It was an example of coming together at the right time in the right way for the right outcome.”


The Marine Corps’ plan to further break down data silos
May 24, 2024

Dr. Colin Crosby, the service data officer for the Marine Corps, said the first test of the API connection tool will use “dummy” logistics data.


The Marine Corps is close to testing out a key piece of its upcoming Fighting Smart concept.

As part of its goal to create an integrated mission and data fabric, the Marines will pilot an application programming interface (API) standard to better connect and share data no matter where it resides.

“Really over the next 12 months, we hope to have the autonomous piece of this API connection implemented in our environment in what we call the common management plane that allows us to execute enterprise data governance where we can then use the capabilities rather than the native capabilities within our environment to develop those data catalogs, to tag data, to track the data from its lineage from creation all the way to sharing and destruction within our environment and outside of our environment,” said Dr. Colin Crosby, the service data officer for the Marine Corps, on Ask the CIO. “We’re working with what we call the functional area managers and their leads on the data that they own because this is all new in how we’re operating. I need them to help me execute this agenda so that we can then create that API connection.”
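The governance loop Crosby describes, where each dataset is cataloged, tagged and tracked through its lineage from creation to sharing to destruction, can be pictured as a small record structure. Below is a minimal sketch of such a catalog entry; the field names and values are illustrative assumptions, not the Marine Corps’ actual schema.

```python
# Sketch of a catalog entry of the kind Crosby describes: each dataset is
# tagged and labeled, and every lifecycle event (creation, sharing,
# destruction) is appended to its lineage. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    action: str                    # "created" | "shared" | "destroyed"
    actor: str
    timestamp: datetime

@dataclass
class CatalogEntry:
    dataset: str
    owner: str                     # the functional area manager who owns the data
    tags: set = field(default_factory=set)
    lineage: list = field(default_factory=list)

    def record(self, action: str, actor: str) -> None:
        self.lineage.append(LineageEvent(action, actor, datetime.now(timezone.utc)))

entry = CatalogEntry("logistics_readiness", owner="logistics_FAM",
                     tags={"logistics", "CUI"})
entry.record("created", "logistics_FAM")
entry.record("shared", "training_and_education_command")
for event in entry.lineage:
    print(event.action, event.actor, event.timestamp.isoformat())
```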

As in many organizations, the Marine Corps’ mission areas own and manage their own data, but sharing it can be difficult because of culture, technology and policy.

Crosby said the API connection can help overcome many of these challenges.

“Our first marker is to have a working API connection on test data. Once that happens, then we’re going to start accelerating the work that we’re doing,” he said. “We’re using logistics data, so what we’re doing is using dummy data, and we’re going to pull that data into our common management plane, and then from that CMP, we want to push that data to what we call the online database gateway. Then, by pulling that into the gateway, we can then push it into the Azure Office 365 environment, where we can then use that data using our Power BI capabilities within our environment.”
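In pipeline terms, the flow Crosby lays out is a staged pull and push: source API to common management plane, on to the online database gateway, then into Office 365 where Power BI reads it. Here is a minimal sketch of that flow using Python’s `requests` library; every URL is a hypothetical placeholder, not a real Marine Corps endpoint.

```python
# Minimal sketch of the staged flow Crosby describes: pull dummy data over
# the API into the common management plane (CMP), push it to the online
# database gateway, then on to the Office 365 environment for Power BI.
# Every URL here is a hypothetical placeholder, not a real endpoint.
import requests

SOURCE_API = "https://example.mil/api/logistics/dummy"   # dummy test data
CMP_INGEST = "https://example.mil/cmp/ingest"
GATEWAY    = "https://example.mil/gateway/push"

def run_pipeline() -> None:
    # 1. Pull the dummy logistics records over the API connection.
    records = requests.get(SOURCE_API, timeout=30).json()

    # 2. Land them in the CMP, where enterprise data governance
    #    (tagging, cataloging, lineage tracking) is applied.
    requests.post(CMP_INGEST, json=records, timeout=30).raise_for_status()

    # 3. Push from the CMP to the gateway, which feeds the Azure /
    #    Office 365 environment where Power BI reads the data.
    requests.post(GATEWAY, json={"dataset": "logistics_dummy",
                                 "rows": len(records)}, timeout=30).raise_for_status()

if __name__ == "__main__":
    run_pipeline()
```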

Testing the API before production

Once the API connection proves out, Crosby said the goal is to push data into the Marine Corps’ Bolt platform, which runs on the Advana Jupiter platform.

He said there is a lot of excitement from logistics and other mission areas around the Marine Corps to prove this API connection technology.

“As we get more comfortable moving forward, then we will bring on the next, what we call, coalition of the willing. As of now, we have a line because we have other organizations now that are like, ‘we want to be a part of this,’” Crosby said. “The training and education command is ready to go. So we’re excited about it because now I don’t have to work that hard to get people on board and now I have people knocking on my doors saying they are ready to go.”

Crosby added that before the API connection goes live with each new organization, his team will run similar tests using dummy data. He said building that repeatable process and bringing in some automation capabilities will help decrease the time it takes to turn on the API tools for live data.

Without these new capabilities, Crosby said it takes weeks to pull CSV files, thus delaying the ability of leaders to make decisions.

“With the API, we’re going to a near-real-time type of pull and push, which is speeding up the decision cycle,” he said. “Then there are opportunities to expand on that by building applications that will aggregate data and then being able to look at data to check the maintenance on equipment, and then it’d be a little bit easier to understand what we need and when. The goal is to shrink that decision cycle a little bit.”
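The contrast Crosby draws, weeks-old CSV extracts versus a near-real-time pull, amounts to polling the API on a short interval instead of waiting on a manual export. A hedged sketch of that pattern follows; the endpoint and interval are illustrative assumptions.

```python
# Sketch of the near-real-time pull Crosby contrasts with weeks-old CSV
# extracts: poll the API on a short interval and hand fresh records to a
# dashboard as soon as they change. Endpoint and interval are illustrative.
import time
import requests

ENDPOINT = "https://example.mil/api/equipment/maintenance"  # hypothetical
POLL_SECONDS = 60

def poll(max_polls: int = 10) -> None:
    last_seen = None
    for _ in range(max_polls):
        batch = requests.get(ENDPOINT, timeout=30).json()
        if batch != last_seen:                  # only act on fresh data
            last_seen = batch
            print(f"refreshed {len(batch)} records")  # e.g. update a dashboard
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    poll()
```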

The API connection tool is one piece to the bigger Marine Corps effort to create an integrated mission and data fabric. Crosby said that initiative also relies on the unification of the Marine Corps enterprise network to bring the business side and the tactical side together into one environment.

“The fabric is a framework and approach of our environment today and how we want to connect our environment in an autonomous fashion using APIs, so that we can pull data and we can share data, regardless of the cloud environment that it’s in, regardless of whatever database structure the data resides in,” Crosby said. “It allows us to be flexible. It allows us to scale and to really push data and pull data at a speed that we’ve never done before. What I love about the fabric is it really gets to that decision making. It allows our commanders to make sense and act within real or near real time.”

The post The Marine Corps’ plan to further breakdown data siloes first appeared on Federal News Network.

CDC cuts the digital fat as part of its website redesign
https://federalnewsnetwork.com/ask-the-cio/2024/05/cdc-cuts-the-digital-fat-as-part-of-its-website-redesign/
Fri, 17 May 2024 18:57:23 +0000
Carol Crawford, the director for digital media at the CDC, said the website redesign reduced the site’s content by about 65%, making information easier to find.


Over the last decade, the Centers for Disease Control and Prevention’s website became bloated, making information hard to find.

An 18-month long effort, called Clean Slate, helped the CDC cut the digital fat by 65%.

Carol Crawford, the director for digital media at the CDC, said the agency used a customer-first approach to modernize its website, which relaunched yesterday.

Carol Crawford is the director for digital media at the CDC.

“It was a complete overhaul of the whole user experience, and of course, a new look and feel,” Crawford said on Ask the CIO. “Coming out of the pandemic, we really looked at what we wanted to improve and what we wanted to do different. This went also along with CDC’s moving forward effort and, combining that, we launched what we call ‘Clean Slate’ and a big cornerstone of that project was starting over with a clean slate for CDC.gov. So that meant we were able to reduce about 65% of our content, which gave us more time and more energy to put toward improving the content that we had. We made a number of other updates that we thought would better improve the experience on CDC.gov.”

Crawford said some of the changes to the website are basic, like ensuring consistent formats on all pages. Others focus on the user, such as adding page summaries to the top of every page so citizens can quickly see whether a page meets their needs.

“We’ve also really streamlined the navigation. We call it content-first navigation that will guide a user through the journey of the content that they’re looking for,” she said. “We organize the content by three primary audiences just to make it a little easier to spot the content that is just for you, or just for what you’re looking for. And of course, we worked on the readability and the scanability of the pages on your desktop, mobile or iPad device. We’ve improved the fonts, for example, to make it easier to skim, kept our page length shorter, so that you can read quickly, and there is so much more.”

No IT upgrades needed

One factor that made the website revamp a little easier was the CDC didn’t have to upgrade the underlying technology.

Crawford said this let the CDC improve the existing technology stack, adding functionality such as using metadata to automate pages that the agency previously had to update manually.

“We’ve expanded our application programming interface (API) use. We’ve expanded our data functionality and data visualizations,” she said. “We’re thinking about using AI, machine learning, natural language processing and some generative AI to really think about how to improve the quality of our content.”
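Metadata-driven automation of the kind Crawford describes typically means rendering pages from structured records plus a shared template instead of hand-editing HTML. Below is a small illustrative sketch; the field names and template are invented, not the CDC’s actual CMS schema.

```python
# Hypothetical metadata-driven page rendering: each page is generated from a
# metadata record plus one shared template, so updating the record updates
# the page. Field names are invented for illustration.
from string import Template

PAGE_TEMPLATE = Template(
    "<h1>$title</h1>\n"
    "<p class='summary'>$summary</p>\n"  # the page summary shown at the top
    "<p>Audience: $audience | Last reviewed: $reviewed</p>\n"
)

pages = [
    {"title": "Flu Prevention", "summary": "How to lower your flu risk.",
     "audience": "General public", "reviewed": "2024-05-01"},
    {"title": "Flu Clinical Guidance", "summary": "Testing and treatment.",
     "audience": "Healthcare providers", "reviewed": "2024-05-01"},
]

for meta in pages:
    print(PAGE_TEMPLATE.substitute(meta))  # a CMS would publish this output
```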

Through a series of surveys and other feedback approaches, the CDC found that citizens and other users were “overwhelmed” by the amount of information on the website. Crawford said that stopped users from finding what they needed.

“We evaluated are these pieces of content that a user needs or ever looks for, are these old content and a number of other criteria, and really, it allowed us to just keep our highest performing content and the content that people really need each day,” she said. “We really looked across our site to see where we could improve on duplicate content. We definitely looked at what people were getting from other servers or sites, but we also looked internally, like many people, we also had duplicate content that we wanted to fold together and make it easier for people to find it all in one place on our site.”

The CDC’s website revamp comes as the Office of Management and Budget is emphasizing specific improvements across all agencies. Just recently, OMB said the Digital Experience Council completed the first federal website inventory and found more than 10,000 sites across the government. Clare Martorana, the federal chief information officer, said recently that, through this inventory, agencies will have a better idea of their entire ecosystem and what they need to do to secure it and improve the user experience.

The inventory is part of a bigger effort to improve the digital experience of users through the requirements laid out in OMB’s September memo and the 21st Century IDEA Act.

OMB also recently issued new guidance for how agencies should improve accessibility under Section 508 requirements. The December memo, the first from OMB in more than a decade, requires agencies to design and develop “accessible digital experiences” by taking a number of steps.

CDC kept users front, center

Crawford said CDC used the U.S. Web Design System standards and leaned into human centered design tactics.

“We worked together across all of our communicators at CDC. The entire group in my digital media division worked on this project along with our CDC.gov web council,” she said. “Human-centered design was absolutely the cornerstone of what we’re doing. We made every decision thinking about what the users needed. We did extensive research on our audience needs. We included many steps during the process to collect information from our audiences. This included surveys, lots of user testing, things like a new rate-this-page feature. We also introduced a beta preview so that we could get lots of feedback from users, and all in all we received input from about 6,000 users, so involving the audience was central to what we did.”

Like all agencies, the CDC had to take steps to make sure it was serving its diverse customer base. Crawford said that meant narrowing down their content around three particular audience areas:

  • The general public
  • Healthcare providers
  • Public health professionals

“We’ve tested specifically with those groups, and then with a lot of diversity within those groups,” Crawford said. “We had to work across every content type CDC had and create ways that we knew would work for people. This meant engaging with people in the programs; the needs of audiences around flu might be different than the needs of the audience around an injury topic, so we had to really work collectively. They have done the hard work of rewriting and reformatting content based on these best practices and the results of our testing.”

Going forward, Crawford said CDC will continue user testing and collecting user feedback. She said the agency is considering at least quarterly user testing with real people on the site as well as pop-up and email surveys.

 

The post CDC cuts the digital fat as part of its website redesign first appeared on Federal News Network.

Air Force increasing cloud capabilities for the warfighter
https://federalnewsnetwork.com/ask-the-cio/2024/05/air-force-expanding-cloud-as-operational-tactical-lines-blur/
Thu, 16 May 2024 16:14:53 +0000
Venice Goodwine, the Air Force’s CIO, said one goal is to create more transparency on how much money mission owners are spending on cloud services.


The Department of the Air Force chief information officer’s strategy to increase the capabilities of its airmen and guardians is centered on expanding the use of cloud services.

Venice Goodwine, the Air Force’s CIO, said the cloud cannot be thought of as just for business applications. The lines between the back office and the tactical edge have blurred, she said.

Venice Goodwine is the Department of the Air Force’s chief information officer.

“I’m expanding the cloud from NIPRNet [unclassified network] to SIPRNet [classified network] and also having all those capabilities as well in that cloud on both sides. As we think about the different classifications, how we get there with those same human-to-human capabilities is important,” Goodwine said at the recent AFCEA NOVA Air Force IT Day, an excerpt of which was played on Ask the CIO. “The other thing when I’m thinking of the cloud, it’s an investment. But I’m also going to create the transparency that we haven’t seen before in the cloud. Now when I think financial operations in the cloud, I now can talk to my system owners about their investment in the cloud, tell them when to pay for reserved instances. I could talk to them about how they can make adjustments in their investment based on their usage or their computing and storage. I didn’t have that visibility before.”

The Air Force is planning to have a single tenant for Office 365 on the secret side, which is different from what the service did with its unclassified version, which has multiple tenants.

Several other military services and agencies also have rolled out O365 on the secret side recently.

“What’s important for my cloud strategy is making sure that I have cloud at the tactical edge. That’s my reliance on commercial cloud services at the edge because if I’m going to have decision advantage, I have to make sure that the data is available. The data needs to be where the warfighter is and the data needs to be in the cloud,” Goodwine said. “I don’t intend to put the data in the continental United States (CONUS) when I’m fighting in INDOPACOM. I need the data there. But then I also need the cloud at the edge. I need the data at the edge. I need artificial intelligence to make sense of the data. And it needs to be trusted. So all the attributes you talk about with data, I need all of that there. So it’s not just enterprise IT. It is IT for the warfighter. That’s my mantra and you’ll hear me say that all the time, and my team speaks that same language.”

Air Force expanding virtual environment

The Air Force continues to mature its approach to buying cloud services. Goodwine, who became the CIO in August, said the Joint Warfighting Cloud Capability (JWCC) remains the first option of where to buy cloud services, especially for new workloads. But, she said, those workloads and applications will remain in the CloudOne platform.

The Air Force is working on a new solicitation for CloudOne, called CloudOne Next.

The Air Force released its request for information for CloudOne Next in September and just in March, it offered more details on its acquisition strategy.

The Air Force expects to release three solicitations for CloudOne Next in the third quarter of 2024 and make the award in the fourth quarter of this year. It will be three single-award blanket purchase agreements on top of the schedules program run by the General Services Administration.

As part of this cloud expansion, Goodwine said the Air Force is developing a virtual environment to make it easier to access applications in a secure way.

“If you’re on your home computer, you have a Mac, you can go to portal.apps.mil and you can access your O365. You can be as productive as you need to be. There is no need for you to VPN in and you can use your home network,” she said. “You want to be able to access your OneDrive, all your apps and email, you can do that today. You only VPN in because you’re trying to get to some shared drives that we’re going to shut down eventually anyway. So really, those are the things that we already have in play that we should take advantage of, especially now that we’re in a hybrid environment. As we move forward, yes, understanding the work that’s done, the hours required to do that work, so that we can make better investment decisions about the technology that we want to use. So I do think there’s a connection between technology and people hours.”

Additionally, Goodwine said the Air Force will expand its “Desktop Anywhere” initiative beyond just the Air Force Reserve Command.

“It now has an Impact Level 5 authority to operate, and we’re going to move it [off-premise] so we’re expanding that. We’ll have the ability to do more of these virtualized environments,” she said. “From a cybersecurity perspective, it’s a great idea because I just reduced my attack surface, and from a productivity perspective, it’s absolutely faster, better, cheaper, and it now really allows you to be mobile, which is what I want my workforce to be: the airmen and guardians.”

The post Air Force increasing cloud capabilities for the warfighter first appeared on Federal News Network.

A new push by OMB to get a handle on 10,000 federal websites
https://federalnewsnetwork.com/ask-the-cio/2024/05/a-new-push-by-omb-to-get-a-handle-on-10000-federal-websites/
Fri, 03 May 2024 13:24:54 +0000
OMB, GSA and the USDS are providing the policies, tools and know-how to help agencies improve digital services like federal websites more quickly.


Over the last six months, agencies inventoried over 10,000 public-facing federal websites and identified their top websites with the most user traffic.

This may have been the first time agencies completed such a website inventory as it was part of the requirements under the Digital Experience (DX) memo from Office of Management and Budget released in September.

Clare Martorana, the federal chief information officer, said over the last few decades as agencies have launched federal websites or web pages, it wasn’t always based on standards or even using a .gov domain.

Federal CIO Clare Martorana.

But now with the inventory and the strong encouragement in the Digital Experience memo to use the U.S. Web Design System standards, Martorana said agencies will have a better idea of their entire ecosystem and what they need to do to secure it and improve the user experience.

“Part of what we’re working on with our agency partners is they’re scanning those websites, they’re understanding what are .gov domains. Oftentimes, agencies have .edu and sometimes they have .com sites. We’re looking across this ecosystem, making sure that they have the tools in place to be able to do that work. Then we do talk to them, and share best practices,” Martorana said in an interview with Federal News Network on Ask the CIO. “We stood up recently the DX Council, which is taking a lot of working groups that have already existed in government for many years, with really passionate federal employees that have been doing this work, bringing them together and then sharing some of these insights so that we can go on this journey together.”

Martorana said NASA, for example, used a new scanning tool to find more and more websites used by the public and their science and research communities.

“They’re sharing with the DX Council and the community, so that other agencies that might not be quite as far along are able to learn from that and talk to those people, figure out what tools that they’ve used, and then that’ll benefit the other agencies as we move forward,” she said.
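The inventory work itself is conceptually simple: take the hostnames an agency believes it owns, group them by top-level domain, and flag anything outside .gov for review. Here is a toy sketch of that pass; the hostnames are made up, and the real scanning tools agencies use are not described in this article.

```python
# Toy inventory pass: group an agency's hostnames by top-level domain and
# flag anything outside .gov for review. The hostnames are made up.
from collections import defaultdict

hostnames = [
    "www.example.gov", "science.example.gov",
    "outreach.example.edu", "programinfo.example.com",
]

by_tld = defaultdict(list)
for host in hostnames:
    by_tld[host.rsplit(".", 1)[-1]].append(host)

for tld, hosts in sorted(by_tld.items()):
    note = "" if tld == "gov" else "  <- review: non-.gov domain"
    print(f".{tld}: {len(hosts)} site(s){note}")
    for host in hosts:
        print(f"  {host}")
```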

The DX Council, which launched in February, includes a Digital Experience Delivery Lead from each agency. The council serves as the primary interagency advisory body for assisting in the governmentwide implementation of the 21st Century IDEA and related digital experience activities.

Driving accessibility through standards

One of the council’s focuses is helping agencies take more advantage of the U.S. Web Design System standards.

Robin Carnahan, the administrator of the General Services Administration, said agencies using the design standards average about 1 billion page views a month.

“What that does is make sure government websites are accessible to everyone,” Carnahan said. “Our job is to come up with a system that is easy to adopt and integrate into their existing functionality. I think that this has been around long enough and proven out enough that folks are ready to say, ‘yes.’ But there’s still more to do. We’ve got a big percentage using the system, and now I’m encouraged the state governments are using it, local governments are using it. I’d encourage international folks to take a look at it, too. It’s all open source and reusable. In fact, like, we ought to do more of that.”

Carnahan said the continued growth of Login.gov is another example of the sharing of tools and capabilities to make digital services better and easier to use. She said about 40 agencies and 50 million active users are taking advantage of the shared service. GSA announced in April it would start a pilot in May to allow individuals to verify their identity online using facial recognition technology that meets standards set by the National Institute of Standards and Technology’s 800-63-3 Identity Assurance Level 2 (IAL2) guidelines.

“We want to continue to build on Login.gov, and expand the use of that, expand our ability to do facial matching because that’s the thing that many of our customers are looking for, in a way that’s equitable, and in line with our values as a country,” she said.

The policies from OMB and tools from GSA all come together when the U.S. Digital Service helps agencies modernize applications and services.

USDS, SSA case study

Mina Hsiang, the administrator of the USDS at OMB, said her team has worked on 20 projects across a dozen agencies over the last few years.

She said a project USDS worked on with the Social Security Administration to improve the agency’s website is a good example of all of the tools, policies and processes coming together.

“They didn’t have a ton of monitoring on the back end to figure out what are people actually doing. What pages are they stuck on? Where transactions fill up? We helped them implement screeners for example. So some people can get their new social security card online, some people cannot; you could go through the entire process of trying to apply for it only to discover that you are not eligible for it. So we helped them to understand what are the high volume transactions, to make it easier upfront for someone to have an early interaction that helps them understand [if they are eligible],” Hsiang said. “We did that using infrastructure that came from GSA, and helping work with the security team to identify what are the top priority transactions that we need to simplify.”

USDS helped SSA implement U.S. Web Design System standards so citizens found the website language easier to understand. She said her team also put in place monitoring tools to create a continuous feedback loop to keep improving the user’s experience.

Martorana said successes like SSA and USDS are helping to show both what can and should be done, and there are tools and help out there.

“We have to make sure these teams have all of those capabilities available to them. It’s really important in the 21st century, as we’re working with these modern tools and trying to meet customers’ expectations, we are utilizing products and services that are instant, accessible and trusted. So we are working really hard together as a team to make sure that we’re meeting that mark,” said Martorana, who also recently issued a six-month update on the progress against the digital experience memo.

Hsiang added the opportunity for large-scale transformation is here for every agency ranging from the IRS to the Centers for Disease Control and Prevention to the Department of Veterans Affairs, and every other agency.

“We try and focus on the projects that can have the largest impact for our investment in them. Some of that is about the criticality of the service for individuals and their circumstances, and some of that is about the longevity of the change,” she said.

 

The post A new push by OMB to get a handle on 10,000 federal websites first appeared on Federal News Network.

Agency enhanced decision making through risk appetite lens
https://federalnewsnetwork.com/ask-the-cio/2024/04/agency-enhance-decision-making-through-risk-appetite-lens/
Mon, 29 Apr 2024 14:16:30 +0000
The Association for Federal Enterprise Risk Management (AFERM) survey showed 66% of all respondents said their ERM program is led by a chief risk officer.


Agencies are required to use enterprise risk management approaches in their planning and budgeting. Despite the eight-year-old mandate in Circular A-123 from the Office of Management and Budget, a new survey highlights why certain agencies are more successful than others.

Jason Bruno, the director of the Office of Strategic Oversight and Performance and chief risk officer at the Department of the Interior’s Bureau of Trust Funds Administration and the president of the Association for Federal Enterprise Risk Management (AFERM), said it’s clearer than ever what it takes to manage agency risks at the enterprise level.

“What we found was that organizations that incorporated ERM, or risk management, into the performance plans for their Senior Executive Service members (SES) or equivalents were among the highest performing scores. Along with that were organizations where the ERM program lead reported directly to the agency head or the deputy of the agency,” Bruno said on Ask the CIO. “So what that tells me is that where agencies are taking proactive measures to take ERM seriously, it’s working, and those agencies are performing well.”

The 2023 AFERM survey showed 84% of respondents said their organization has an ERM program and 66% said their program has been in place for at least five years. Additionally, the survey showed a 20% increase, to 66%, in the share of respondents who said their ERM program is led by a chief risk officer.

Bruno said with each year of the survey, now in its sixth, AFERM sees more money and resources devoted to ERM, including 38% of respondents who say they have at least 11 people working on risk management on a full-time basis.

“We’re seeing that organizations now are reporting that their ERM programs have been established for longer periods of time, and we’re finding that the ERM program leads generally are spending at least 50% of their time on ERM activities, rather than ERM being one of many activities that the lead is responsible for,” he said. “It’s having benefits. Those organizations where the ERM spends at least 50% of their time directly on ERM activities are scoring much higher than those who don’t.”

Source: AFERM/Guidehouse 2023 survey.

Kate Sylvis, an enterprise risk management practice leader at Guidehouse, which supported the AFERM survey, said the survey results tell a good news story about ERM in many ways.

She said with 83% of the survey respondents having an ERM program in place over three years, it’s clear the foundational aspects of this methodology are in place. Sylvis said that means agencies have the opportunity now to do some of the harder things with ERM.

“Those are the types of things like integrating their ERM programs and ERM thought with all of their management processes, and having conversations about risk appetite, or how much risk they’re willing to take or trade off as they’re trying to pursue their objectives,” she said. “If you look at the four markers of integration with management processes, the means for those four processes have been going up every year. This year, only one of those processes reverted back a little bit, but not so much that it’s statistically significant. So we see that trend of increasing integration across all the management processes. We saw an increase in the number of organizations that have either implemented a risk appetite statement, have used that risk appetite statement or are considering using a risk appetite statement. And that’s a big deal.”

Source: AFERM/Guidehouse 2023 survey.

Sylvis said understanding and applying a “risk appetite” to decision making is a more advanced concept for many organizations. She said that more agencies being able to quantify risk and decide how much of it to accept demonstrates the continued maturity of ERM.

The survey found that more than 90% of the respondents indicated that their agency updated its risk appetite statement within the last three years, and over 60% of the respondents indicated that their programs plan to increase their focus on risk appetite over the next 12 months.

Bruno said it’s clear that, across the government, agencies are putting a higher focus on risk appetite statements as a way to communicate these challenges throughout their organizations and to integrate them into strategy development and decision-making processes.

“What that shows is a maturation of ERM programs and how seriously people take it. We’re talking about mitigation strategies for risks,” he said. “In my organization, we formalized our ERM program probably about five or six years ago by developing a risk governance structure, a senior risk council and a senior assessment team that are there to assess risks. As we have matured, we’ve gone from just having a risk register that lists all the risk and a risk profile that talks about the treatments or the mitigations of the risk to really incorporating the concepts of what to do about those real world risks with the conversations about risk appetite.”
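Bruno’s progression, from a risk register that lists risks, to a risk profile that adds treatments, to decisions framed by a risk appetite, can be made concrete with a small sketch. The sketch below is illustrative only: the 1-to-5 scoring scale, the field names and the single numeric appetite threshold are assumptions made for the example, not AFERM’s or Interior’s actual model.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    category: str      # e.g., business continuity, financial, compliance
    likelihood: int    # 1 (rare) .. 5 (almost certain) -- assumed scale
    impact: int        # 1 (minor) .. 5 (severe) -- assumed scale
    treatment: str     # planned mitigation, as recorded in the risk profile

    @property
    def score(self) -> int:
        # A simple likelihood-times-impact score; real programs often
        # use qualitative or category-specific scales instead.
        return self.likelihood * self.impact

# The risk register lists the risks; the treatments make it a risk profile.
register = [
    Risk("Legacy system outage", "business continuity", 4, 4, "stand up failover"),
    Risk("Improper payments", "financial", 2, 5, "tighten payment controls"),
    Risk("Minor audit finding", "compliance", 3, 2, "accept and monitor"),
]

# A risk appetite statement, reduced here to one hypothetical threshold:
# scores above it demand treatment; scores at or below it may be accepted.
RISK_APPETITE = 8

for risk in register:
    decision = "treat" if risk.score > RISK_APPETITE else "accept"
    print(f"{risk.name}: score={risk.score} -> {decision} ({risk.treatment})")
```

Even a toy threshold like this shows the shift Bruno describes: the register stops being a static list and becomes an input to a decision rule.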

Sylvis added that for a lot of organizations in both the public and private sectors, this means moving risk appetite from a theoretical exercise to a practical one. More mature agencies are aligning their strategic objectives with their mission and customers, and with all the underlying risks associated with those areas.

She said organizations are asking a lot more questions around:

  • How much risk am I willing to take to achieve that strategy?
  • What do we have to do in order to achieve the value and the outcomes that those programs offer and how much risk do we have to take to do that?
  • What are the tradeoffs here if I’m trying to achieve this strategic objective?
  • What does that look like from a resource trade off, whether or not that’s dollars or people?
  • Do we take a risk-based approach to compliance, accepting a little more compliance risk on the front end and changing some of our compliance requirements on the back end, to give ourselves capacity to achieve an objective in a different way with some of those resources?

“I can move those conversations around that strategic objective, around that risk appetite and what I’m willing to take on. That really hits at the business and the mission, and the senior executives like it; they can dig their teeth into those because it makes sense and it’s a real decision-making conversation,” Sylvis said.

Source: AFERM/Guidehouse 2023 survey.

Sylvis said the survey results also show where the challenges to ERM continue to lie.

One area is the continued gap between the perception of risk and what management is doing about risk. She said many times the perception of risk is much lower than what management is doing about those specific risks, which tend to fall into one of five categories: business continuity, financial risk, compliance, reporting and fraud.

“That mismatch is an opportunity for organizations to say, ‘Am I over controlling this? Are these activities that we’re taking more than my residual risk appetite actually is for these particular risk types?’ I think those are questions that we need to ask ourselves, rather than just running business processes as usual, particularly as we look at budget constraints that are going to become more and more prevalent,” she said. “We have to look at programs and say, ‘Is there a way that we can maintain this program with the same risk profile? Or should my risk profile for this program be different so that I can release capacity, move resources to another area that needs them, so that our agencies can really build resilience in and be able to function in the event of more budget cuts, which I do think is a possibility to come?’ These five risk areas are the start of that, and I think building the capability to look at our processes from that perspective is something that will benefit us as a community down the line.”

Bruno said while the survey showed continued maturation of ERM programs, the pace remains slower than it needs to be. He said more agencies would benefit from formally designating a specific person as a chief risk officer versus dual-hatting that responsibility with someone in the CFO or other office.

“I get a lot of calls from people and agencies who are saying, ‘I’d like to get an ERM program off the ground. Do you have some recommendations for me? Can you walk through how you created your organization?’ I just talked to someone who wanted to get their ERM program off the ground, has only about 30 employees, and it’s really hard to have a dedicated ERM or CRO professional who only does ERM when your entire office is only 30 people,” he said. “In situations like that, one of the things about ERM that I really like is the ability of risk leaders to change things up on the fly. One of the things that I tell the risk management analysts under me in my office is that going by the book is fine, and it’s a great start. But you’ve got to break away from the by-the-book implementation of ERM because some organizations, like the Department of the Interior, have 30,000 people and some are much smaller. So the way you perform risk management in a small office is quite different from the way the CRO for the Department of the Interior performs risk management.”

FEMA’s cloud journey hitting uphill portion of marathon
https://federalnewsnetwork.com/ask-the-cio/2024/04/femas-cloud-journey-hitting-uphill-portion-of-marathon/
Fri, 19 Apr 2024 21:37:18 +0000
Charlie Armstrong, the chief information officer at FEMA, said two recent successful migrations of applications to the cloud demonstrate progress.

The Federal Emergency Management Agency has about 60% of all workloads in the cloud.

Now the hard work really begins.

Charlie Armstrong, the chief information officer at FEMA, said he is pushing the agency to shut down more data centers and expand the number of applications in the cloud.

Charlie Armstrong is FEMA’s chief information officer.

“The engineering and technical pieces get harder and that may slow down our velocity a little bit. But the goal is to get everything migrated out of that data center so that we can start to decommission it and shut it down. That’s the primary work stream that we have going on,” Armstrong said on Ask the CIO. “In addition to that, in the September or October timeframe, we kicked off a small re-hosting effort, that’s actually pulling the covers back on some of these systems that are not planned to be modernized for a while, and maybe doing some database re-platforming, making them ready to be what we call cloud natives so that we can really leverage the value of cloud and be able to scale up and scale down as we need to.”

Over the long term, Armstrong, who joined the agency in February 2023, said he’d like to get FEMA out of the data center business as much as possible.

In the meantime, Armstrong said the migration of applications and workloads to the cloud will pick up steam in the coming year.

“We got off to what I would call a slow start, and mainly that was the complexity around the networking between our existing data center and our connections into the cloud service providers that we’re using. It took us some time to make sure that we had the latency issues worked out so that we had the performance that was required in order to keep those as viable applications,” he said. “We started with development and test because we didn’t want to get the mission critical operational applications in a risky situation. We started to build some velocity up last spring, and as we continued to work through some of the technology challenges, we had about 25% of workloads, which we measure by virtual machines, hit that milestone by the end of the fiscal year in September. The velocity has continued to improve since then, and we’ve gotten up to the 60%.”
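Measuring progress by virtual machines rather than by application names keeps the metric simple. A minimal sketch of that bookkeeping, with an invented inventory purely for illustration (FEMA’s actual workload names and counts are not public here):

```python
# Hypothetical workload inventory; names and VM counts are invented.
workloads = {
    "grants-dev-test": {"vms": 40, "migrated": True},
    "flood-insurance": {"vms": 120, "migrated": True},
    "financial-legacy": {"vms": 90, "migrated": False},
}

total_vms = sum(w["vms"] for w in workloads.values())
migrated_vms = sum(w["vms"] for w in workloads.values() if w["migrated"])

# Progress is simply the migrated share of the VM fleet.
print(f"cloud progress: {migrated_vms / total_vms:.0%} of VMs migrated")
```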

FEMA to re-platform systems

To reach that other 40% of applications, Armstrong said his team is taking a few different approaches. One is to re-platform workloads that aren’t ready to move to the cloud.

“Actually, our first application that we’re planning on re-platforming is our training system, which is low risk, but an important system because training is a key part of our mission and making sure that people are trained up, not just to do their day-to-day job, but to be able to respond to disasters, and incident management is really important to the agency,” he said. “It’s taken us a little bit of time to do some analysis around what it’s going to take to migrate that to a newer platform, and we’re working through that schedule of milestones now. Then, we’ve got some second and third order systems that we’re looking at. I can’t go into the names of them for security reasons, but they would help us with the vetting of people and things like that.”

Armstrong said FEMA has a goal of re-platforming or re-factoring five or six systems by the end of the calendar year.

One big question FEMA still needs to answer is what to do with systems that would cost a lot of time or money to re-platform but that the agency will eventually shut down when it modernizes the workload.

Armstrong said these are the trade-off decisions that his team is making based on the current state of the system, where it fits in the mission area and the return on investment from modernizing.

“What happens with the things that don’t get moved to cloud? Well, the goal is to get everything out of the existing data center. At a minimum, we would take that remaining amount and move them to some kind of a co-location facility so that we can actually decommission the data center that we have,” he said. “It’s more cost efficient to move to a co-location facility so that we can shut down an aging facility and not have to recapitalize things like the power and heating and cooling, and things like that.”
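Armstrong’s co-location argument is, at bottom, a cost-avoidance calculation: moving the residual workloads lets the agency skip recapitalizing power, heating and cooling in an aging facility. A back-of-the-envelope comparison in that spirit, where every dollar figure is hypothetical and only the shape of the tradeoff matters:

```python
# All figures are invented for illustration; only the comparison's
# structure (recapitalization avoided vs. colo fees paid) is the point.
YEARS = 5
recap_cost = 12_000_000        # one-time power/heating/cooling recapitalization
data_center_opex = 3_000_000   # per year, running the aging facility
colo_fee = 2_200_000           # per year, co-location facility
migration_cost = 1_500_000     # one-time move of the residual workloads

stay = recap_cost + data_center_opex * YEARS
move = migration_cost + colo_fee * YEARS
print(f"stay: ${stay:,}  move: ${move:,}  avoided: ${stay - move:,}")
```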

Cloud brokerage maturing

As FEMA continues in its cloud journey, the CIO office’s cloud brokerage will play a larger role in shepherding applications to the right cloud.

Armstrong said the office remains in its early stages, but is helping more and more mission areas with their cloud decisions.

“Obviously, we’re just like everybody else, we’re still learning our way into cloud. We’re still working on upskilling our workforce on being more cloud centric and savvy,” he said. “We’re working it as we mature on our cloud processes.”

Armstrong said FEMA had two recent cloud migration successes. One is moving 19 grant applications to the new FEMA Grants Outcomes, or FEMA GO, platform. The agency plans to shift approximately 20 more grant programs over to the platform this month. A second is FEMA’s National Flood Insurance Program (NFIP).

Armstrong said both reached full operational capability in the cloud at the end of March and now are finishing the data transition work.

“We struck out on a goal about a year and a half ago to get those applications migrated to cloud primarily through a lift and shift approach,” he said.

A third system that started its cloud journey is FEMA’s financial systems modernization effort. This has long been a challenge for all of the Homeland Security Department.

“I think getting some of our critical customer facing applications in the cloud would really hit the mark. There are some things that, whether we get them moved to cloud or not in the next year, probably won’t make a big shift in customer satisfaction or our ability to hit mission. But if you look at areas like individual assistance, which is very much a customer-facing application because they’re providing assistance to that survivor at the point in time of a disaster, the ability to scale up and meet a surge capacity is something we always try to plan for, or I should say the agency has tried to plan for the worst day,” he said. “So having a cloud service provider really provides a lot more room for surge capacity. When we talk about resilience, cloud is a key part of that because being able to surge up, and being able to leverage different types of software-as-a-service through the cloud that we may need to opt in to on the fly in order to meet a special demand, those are really the reasons why we’re so adamant on getting to cloud. I don’t anticipate that there’s going to be some huge cost savings. At the end of the day, we do get to avoid some re-capitalization of the equipment that’s in the facility today. The way I see it, we’re going to get to more resilient data centers and be able to do things like geographic diversity, and have multiple points of entry through the departmental new cap programs.”

Army has burned the software development bridges behind them
https://federalnewsnetwork.com/ask-the-cio/2024/04/army-has-burned-the-software-development-bridges-behind-them/
Mon, 15 Apr 2024 12:19:48 +0000
Margaret Boatner, deputy assistant secretary of the Army for strategy and acquisition reform, said new approaches to buying software already are paying off.


The Army has seen enough from its testing of the Adaptive Acquisition Framework to know what its future looks like.

And that future is built around six pathways that move the services away from a one-size-fits-all approach to buying and managing technology.

Margaret Boatner, the deputy assistant secretary of the Army for strategy and acquisition reform, said these six pathways, outlined in the Defense Department’s Adaptive Acquisition Framework released in 2020, have shown enough promise to force the service to change its approach to how it buys and develops software.

Margaret Boatner, deputy assistant secretary of the Army for strategy and acquisition reform

“I think two really key things about the software pathway. One, it eliminates a lot of bureaucracy and process that is typical of the traditional acquisition process. For example, we can operate totally outside of the traditional requirements process. There’s a lot less documentation and review requirements to start a program on the software pathway,” Boatner said on Ask the CIO. “But even more importantly, what it does do is it actually requires us to use modern software practices. It’s not an option. We have to use agile, DevSecOps, continuous integration, continuous delivery (CI/CD) and those types of things. We have to deliver capabilities within one year and annually thereafter. So really it’s forcing faster cycle times than what we employ traditionally when you look at our software systems.”

She added the Army typically released new features or capabilities every three to four years under traditional waterfall-based programs. With the software pathway, it has accelerated the release of new services to every nine to 12 months at most, with a goal of releasing every six months or sooner.

While the Army is far from industry standards of releasing new capabilities every few weeks or even more quickly, she said these initial changes show real progress in changing the culture and outcomes.

“I do think it’s a sign that we have adapted and are adapting to more of these agile processes that are required by the software pathway. The Army has 14 programs on the pathway now, but we are actively in the process of transitioning more over including more of our traditional defense business systems because those are now allowed to transition and pivot over to the software pathway as well,” she said. “For any of our software intensive capabilities, we want to get on the software pathway because of all of these unique flexibilities and the fact that it forces us to align to some of these industry best practices.”

DoD’s Adaptive Acquisition Framework splits the acquisition approaches into six pathways:

  • Major capability acquisition (the pathway that will handle most of the military’s traditional hardware procurements).
  • Urgent capabilities.
  • Software (including, in some cases, software that will be part of major weapons systems).
  • Business systems.
  • Services.
  • Middle-tier acquisitions that use the recently-enacted “Section 804” authority for rapid fielding and rapid prototyping.

Under AAF, the objective is to flip the script: Start with a baseline of rules that only really matter for the pathway that best fits their program, and “tailor-in” whichever additional requirements and acquisition best practices fit the actual product or service they’re buying or building.

Reducing documentation requirements

Boatner said the AAF is changing the Army’s mindset in two main ways.

“First, it really allows us to tailor acquisition approaches. So moving away from the one-size-fits-all approach, we now have the ability to choose between six different and distinct pathways based on the characteristics of our program,” she said. “We could choose one or we could choose multiple pathways depending on the needs of our program. It also emphasizes tailoring-in versus tailoring-out of other requirements. So instead of having to comply with 35 documentation and review requirements for everything, we get to say, ‘hey, this document and this review requirement are appropriate based on the program.’ The second thing that it does is really empower our program managers. We can delegate decisions down as much as possible, including the choice of the pathway and which documents and review requirements apply. It’s really pushing down a lot of that decision making, which helps to streamline the process.”
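The tailoring flip Boatner describes is, in effect, set logic: instead of starting from all 35 documentation and review requirements and arguing items out, a program starts from a small pathway baseline and argues items in. A minimal sketch under assumed names follows; the requirement labels, baselines and counts are invented for illustration and are not the AAF’s actual document sets.

```python
# Hypothetical requirement names; the real AAF document set differs.
ALL_REQUIREMENTS = {f"doc-{i:02d}" for i in range(1, 36)}  # the old "comply with 35"

# Invented minimal mandatory sets per pathway.
PATHWAY_BASELINE = {
    "software": {"doc-01", "doc-04"},
    "major-capability": {"doc-01", "doc-02", "doc-03", "doc-07"},
}

def tailor_in(pathway: str, justified_extras: set[str]) -> set[str]:
    """Start from the pathway baseline and add only what the PM justifies."""
    return PATHWAY_BASELINE[pathway] | justified_extras

# Old model: tailor OUT of everything; most requirements survive.
tailored_out = ALL_REQUIREMENTS - {"doc-20", "doc-21"}   # 33 items remain

# New model: tailor IN from the baseline; only justified items appear.
tailored_in = tailor_in("software", {"doc-09"})          # 3 items total

print(len(tailored_out), len(tailored_in))  # 33 3
```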

The expanded use of the AAF also is part of a broader effort across the Army to change the way it manages and buys software.

Secretary Christine Wormuth issued a new agile software policy detailing five changes to reform what she called the institutional processes of the Army.

Leo Garciga, the Army’s chief information officer, said recently the policy changes will help the service streamline its ability to build contracts based on agile and DevSecOps methodologies.

Boatner said these changes will not happen overnight, recognizing the Army has built up these habits and processes over the course of decades.

“We’ll do a full communication blitz, where we go to all of the program executive office shops and all of the contracting shops to make sure they understand the direction that we are moving in. We’re also trying to centralize expertise in a couple of places. Contracting, for example, is one way that we’re trying to centralize this expertise, such that contracts will flow through the same group of people who really can become very, very savvy in this, who are more skilled in writing and executing contracts or agreements for software development efforts,” she said. “We’re trying to pool another group of experts that are going to help folks from the headquarters level as they do their software development efforts, more from the technical software development side. It’s really making sure we have the right expertise in the right place to actually execute a lot of these things, in addition to all of the communication and the roadshows that we, of course, plan to do.”

She added her office also will work with the larger cybersecurity and test and evaluation communities to ensure they understand how the AAF works and what it means for their specific areas.

 

The post Army has burned the software development bridges behind them first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/ask-the-cio/2024/04/army-has-burned-the-software-development-bridges-behind-them/feed/ 0
Why the principal cyber advisor ended up being a good thing
https://federalnewsnetwork.com/ask-the-cio/2024/04/why-the-principal-cyber-advisor-ended-up-being-a-good-thing/
Mon, 08 Apr 2024 13:44:42 +0000
Chris Cleary, the former principal cyber advisor for the Navy, left in November after three years in the role and helped establish the value of his office.


A few years ago, the Defense Department drafted a legislative proposal to get rid of principal cyber advisor positions across all services.

While this idea didn’t make it out of the Pentagon, three-plus years later, Chris Cleary, the former principal cyber advisor for the Department of the Navy, said that was a good thing.

Cleary, who left government recently and joined ManTech as vice president of its global cyber practice, said the impact of the principal cyber advisor in the Navy is clear and lasting.

Chris Cleary was the Department of the Navy’s principal cyber advisor for three years before leaving late last year.

“This is challenging because all the services in the very, very beginning wanted to get rid of the principal cyber advisors. There was a legislative proposition that was trying to be submitted and Congress came over the top and said, ‘No, you’re going to do this,’” Cleary said during an “exit” interview on Ask the CIO. “So year one in the job, I make the joke, I was just trying to avoid getting smothered by a pillow because no one wanted this position, especially after we just stood up the re-empowered CIO office, so what’s a PCA? And what’s this person going to do for the organization? I was very attuned to that and ready: if the decision was to push back on this creation, and maybe do away with the PCA job, I was just going to go back to being a chief information security officer. I was being a good sailor and focused on whatever are the best needs of the Navy. I was prepared to do that.”

The move to get rid of the principal cyber advisors never came to fruition and, instead, the Navy, and likely other military services, now see the value in the position.

Cyber advisor wields budget influence

Cleary said one way the principal cyber advisor continues to provide value is around budgeting for cybersecurity. He said each year his office submits a letter on the “budget adequacy” to the Defense Department’s planning process, called the Program Objective Memorandum (POM).

“I found that the PCA office really became the champion for advocating and supporting programs like More Situational Awareness for Industrial Control Systems (MOSAICS), which was a thing we were doing for operational technology systems ashore, and another product called Situational Awareness, Boundary Enforcement and Response (SABER), which was its cousin for OT stuff afloat,” he said. “What you found is both of those programs were being championed by hardworking, honest Navy employees that just couldn’t break squelch to get them properly resourced, funded or programmed for. The PCA was able to champion these things within the E-Ring of the Pentagon. Things like MOSAICS, as an example, I am very proud of; we worked very closely with the Assistant Secretary of the Navy for Energy, Installations and Environment, Meredith Berger. She very quickly recognized the problem; most of this fell kind of within her sphere of influence as the person responsible for resourcing all of the Navy’s infrastructure. She very quickly embraced it, adopted it and hired an individual within the organization to look at this specifically.”

Cleary said over the course of the next few years, he worked with Berger’s team as well as other cyber experts in the Navy and across DoD to do deep dives into how to secure OT.

When the Defense Department rolled out its zero trust strategy in November 2022, the services faced more challenges around operational technology than typical IT. Cleary said the PCA helped the Navy better understand that the OT stack was more complex and that the tools used for IT wouldn’t necessarily work.

“The further you get down closer to an actual device or controller, you can’t just roll a firewall out against that,” he said. “They have their own vulnerabilities and risks associated with them. But they’re things that we haven’t traditionally looked at. When I’m talking about OT, like weapon systems and defense critical infrastructure, it’s this massive foundation of things that not only enable what we do from an enterprise IT standpoint, but that keep the lights on and the water flowing. The Aegis weapon system has lots of computers with it, but that isn’t an enterprise IT system. So who’s looking at those? Who’s resourcing those? It’s only been the last decade or so that we’ve seen a lot of these as legitimate target areas.”

Champion of attention, resources

Cleary said his office helped get the Navy to spend more money and resources on protecting operational technology because it wasn’t always a top priority.

The OT example, Cleary said, is exactly why Congress created the PCA.

“We didn’t do any of the work to create these things. We just championed them appropriately and ensured they got the attention they deserved, and then ultimately the resourcing required so they can be successful,” he said.

Cleary said it was clear that after three-plus years as the principal cyber advisor for the Navy, the benefits outweighed any concerns.

He said with the cyber world becoming more convoluted and complex, the position helps connect dots that were previously difficult to bring together.

“I think Congress would come and ask a question and they would get 10 different answers from 10 different people. I’m not saying we got there. But the idea of the PCA was to get those 10 different answers from 10 different people and then try to consolidate that answer into something that made sense that we could agree upon and present that answer back to Congress,” he said. “I’m not going to say we fully succeeded there because there are a lot of ways around the PCA and the PCA offices, but I think as the offices get more and more established, organizations like Fleet Cyber Command for the Navy, the Naval Information Forces and others were seeing the benefit of the PCA’s job to be the middleman and deal with the back and forth.”

Continue to create trust

Cleary said toward the end of his tenure, these and other offices, including the Marines cyber office, started to work even more closely with his office on these wide-ranging cyber challenges. He said the principal cyber advisor was slowly, but surely becoming the trusted cyber advisor initially imagined.

“I use the analogy of a fishing line: when you start pulling out a fishing line and you’re not sure what the weight of the fishing line is, if you break the line, it’s over. So the trick was to pull on it with just the right amount of tension without risking breaking it,” he said. “I knew the PCA office was something new, and if the relationships with those organizations became tenuous, or were cut off because of the PCA coming in and saying, ‘Hey, you shall do this or that,’ it wasn’t going to work. The way I envisioned the role of PCA was not to tell anybody inside the organization how to operationalize their own environments. My whole job was to go to them and understand what it is they needed, based on their experience and their expertise, and then get them that. The more that I could be seen as a value, and not as someone here to check their homework and poke them in the eye about their readiness, the more successful I’d be.”

Cleary said for the principal cyber advisor to continue to be successful, they have to continue to establish trust, understand their role is personality driven and focus on getting the commands the money and resources they need to continue to improve their cyber readiness.

Understanding the data is the first step for NIH, CMS to prepare for AI
https://federalnewsnetwork.com/ask-the-cio/2024/03/nih-cms-finding-a-path-to-better-data-management/
Fri, 29 Mar 2024 19:53:52 +0000
NIH and CMS have several ongoing initiatives to ensure employees and their customers understand the data they are providing as AI and other tools gain traction.


The National Institutes of Health’s BioData Catalyst cloud platform is only now starting to take off, despite being nearly six years old.

It already holds nearly four petabytes of data and is preparing for a major expansion later this year as part of NIH’s goal to democratize health research information.

Sweta Ladwa, the chief of the Scientific Solutions Delivery Branch at NIH, said the BioData Catalyst provides access to clinical and genomic data already and the agency wants to add imaging and other data types in the next few months.

Sweta Ladwa is the chief of the Scientific Solutions Delivery Branch at NIH.

“We’re really looking to provide a free and accessible resource to the research community to be able to really advance scientific outcomes and therapeutics, diagnostics to benefit the public health and outcomes of Americans and really people all over the world,” Ladwa said during a recent panel discussion sponsored by AFCEA Bethesda, an excerpt of which ran on Ask the CIO. “To do this, it takes a lot of different skills, expertise and different entities. It’s a partnership between a lot of different people to make this resource available to the community. We’re also part of the larger NIH data ecosystem. We participate with other NIH institutes and centers that provide cloud resources.”

Ladwa said the expansion of new datasets on the BioData Catalyst platform means NIH also can provide new tools to help mine the information.

“For imaging data, for example, we want to be able to leverage or build in tooling that’s associated with machine learning, because that’s what imaging researchers are primarily looking to do: they’re trying to process these images to gain insights. So tooling associated with machine learning, for example, is something we want to be part of the ecosystem, which we’re actively working to incorporate,” she said. “A lot of tooling is associated with data types, but it also could be workflows, pipelines or applications that help the researchers really meet their use cases. And those use cases are all over the place because there’s just a wealth of data there. There’s so much that can be done.”
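As a concrete illustration of the kind of image-processing tooling described above, here is a minimal sketch in Python (using the common NumPy and Pillow libraries), assuming a hypothetical local directory of de-identified images. The file layout and the simple intensity statistics standing in for machine learning features are invented for illustration; this is not BioData Catalyst’s actual tooling or API.

# Hypothetical sketch: batch "feature extraction" over a directory of images.
# The ./imaging-data path and the statistics below are illustrative only.
from pathlib import Path

import numpy as np
from PIL import Image

def extract_features(image_path: Path) -> dict:
    # Convert to grayscale and compute simple intensity statistics
    # as stand-in features for a real ML pipeline step.
    pixels = np.asarray(Image.open(image_path).convert("L"), dtype=float)
    return {
        "file": image_path.name,
        "mean": pixels.mean(),
        "std": pixels.std(),
        "p95": np.percentile(pixels, 95),
    }

if __name__ == "__main__":
    for path in sorted(Path("./imaging-data").glob("*.png")):
        print(extract_features(path))

A real workflow on the platform would presumably swap the directory walk for the catalog’s data-access layer and the statistics for an actual model, but the shape of the step (load, process, emit per-image records) is the same.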

For NIH, the users in the research and academic communities are driving both the datasets and associated tools. Ladwa said NIH is trying to make it easier for the communities to gain access.

NIH making cloud storage easier

That is why cloud services have been and will continue to play an integral role in this big data platform and others.

“The NIH, in the Office of Data Science Strategy, has been negotiating rates with cloud vendors, so that we can provide this cloud storage free of cost to the community and at a discounted rate to the institute. So even if folks are using the services for computational purposes, they’re able to actually leverage and take benefit from the discounts that have been negotiated by the NIH with these cloud vendors,” she said. “We’re really happy to be working with multiple cloud vendors to be able to pass some savings on to really advance science. We’re really looking to continue that effort and expand the capabilities with some of the newer technologies that have been buzzing this year, like generative artificial intelligence and things like that, and really provide those resources back to the community to advance the science.”

Like NIH, the Centers for Medicare and Medicaid Services is spending a lot of time thinking about its data and how to make it more useful for its customers.

In CMS’s case, however, the data is around the federal healthcare marketplace and the tools to make citizens and agency employees more knowledgeable.

Kate Wetherby is the acting director for the Marketplace Innovation and Technology Group at CMS.

Kate Wetherby, the acting director for the Marketplace Innovation and Technology Group at CMS, said the agency is reviewing all of its data sources and data streams to better understand what it has and to make its websites and the overall user experience work better.

“We use that for performance analytics to make sure that while we are doing open enrollment and while we’re doing insurance for people, that our systems are up and running and that there’s access,” she said. “The other thing is that we spend a lot of time using Google Analytics, using different types of testing fields, to make sure that the way that we’re asking questions or how we’re getting information from people makes a ton of sense.”

Wetherby said her office works closely with both the business and policy offices to bring the data together and ensure it’s valuable.

“Really the problem is if you’re not really understanding it at the point of time that you’re getting it, in 10 years from now you’re going to be like, ‘why do I have this data?’ So it’s really being thoughtful about the data at the beginning, and then spending the time year-over-year to see if it’s something you should still be holding or not,” she said.

Understanding the business, policy and technical aspects of the data becomes more important for CMS as it moves more into AI, including generative AI, chatbots and other tools.

CMS creating a data lake

Wetherby said CMS must understand its data before applying these tools.

“We have to understand why we’re asking those questions. What is the relationship between all of that data, and how we can improve? What does the length of data look like, because we have some data that’s a little older and you’ve got to look at that and be like, does that really fit into the use cases and where we want to go with the future work?” she said. “We’ve spent a lot of time, at CMS as a whole, really thinking about our data, and how we’re curating the data, how we know what that’s used for, because we all know data can be manipulated in any way that you want. We want it to be really clear. We want it to be really usable. Because when we start talking in the future, and we talk about generative AI, we talk about chatbots or we talk about predictive analytics, it is so easy for a computer, if the data is not right, or if the questions aren’t right, to really not get the outcome that you’re looking for.”

Wetherby added that another key part of getting the data right is the user’s experience and determining how CMS can share that data across the government.

In the buildup to using GenAI and other tools, CMS is creating a data lake to pull information from different centers and offices across the agency.

Wetherby said this way the agency can place the right governance and security around the data, since it crosses several data types, including clinical and claims information.
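Since the article describes tagging data by type and placing governance around it in the lake, here is a minimal sketch of what catalog-level governance could look like. The dataset names, offices and roles are made up for illustration and do not reflect CMS’s actual systems.

# Hypothetical sketch: a data-lake catalog that records governance metadata
# (source office, data type, allowed roles) and answers access questions.
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    source_office: str
    data_type: str                      # e.g. "clinical" or "claims"
    allowed_roles: set = field(default_factory=set)

class DataLakeCatalog:
    def __init__(self):
        self._datasets = {}

    def register(self, ds: Dataset) -> None:
        # Every dataset enters the lake with its governance metadata attached.
        self._datasets[ds.name] = ds

    def can_read(self, dataset_name: str, role: str) -> bool:
        return role in self._datasets[dataset_name].allowed_roles

catalog = DataLakeCatalog()
catalog.register(Dataset("marketplace_claims_2024", "Marketplace IT", "claims",
                         {"claims-analyst"}))
print(catalog.can_read("marketplace_claims_2024", "claims-analyst"))   # True
print(catalog.can_read("marketplace_claims_2024", "web-analytics"))    # False

The design point is the one Wetherby makes: the metadata is captured at ingest, so that years later the agency still knows why it has the data and who may use it.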

The post Understanding the data is the first step for NIH, CMS to prepare for AI first appeared on Federal News Network.

DoD’s approach to fix its computers is function over form (Fri, 22 Mar 2024)
https://federalnewsnetwork.com/ask-the-cio/2024/03/dods-approach-to-fix-its-computers-is-function-over-form/

Leslie Beavers, the principal deputy CIO for DoD, said a key focus for the near future is to improve the warfighter’s experience in using IT.


A year after a scathing report from the Defense Business Board found general unhappiness with the technology user experience across the Defense Department, the chief information officer’s office is taking a simple approach to fixing the computers.

A big part of this effort came earlier this year when DoD’s CIO created a customer experience office, led by Savanrith Kong, who now serves as the senior advisor for the user experience (UX) portfolio management office (PfMO).

Leslie Beavers, the principal deputy CIO for DoD, said the overarching philosophy behind this improved CX approach is putting the user and their mission first.

Leslie Beavers is the principal deputy CIO for the Defense Department.

“I always lead off with, it’s got to be functional first. If it’s so secure that we can’t connect, we’re going to go around it and that’s not good,” Beavers said on Ask the CIO. “We have to be able to scale it. That’s the other big challenge that we have in the department. Not just internally, but we have to be able to scale to international allies and partners into the commercial world. Then the third piece is we have to be secure, and in this case, it’s with the zero trust. It’s tagging the people, tagging the data and doing the audit so that we know what’s happening and we can identify intrusions.”
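Beavers’ description of zero trust (tag the people, tag the data, audit every decision) maps onto a simple attribute-based access pattern. The sketch below is a generic illustration of that pattern, with invented users, tags and policy; it is not DoD’s actual implementation.

# Hedged sketch of "tag people, tag data, audit everything": an
# attribute-based check where every decision is written to an audit log.
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
audit = logging.getLogger("audit")

@dataclass(frozen=True)
class Subject:
    user_id: str
    clearances: frozenset      # tags on the person

@dataclass(frozen=True)
class Resource:
    name: str
    labels: frozenset          # tags on the data

def authorize(subject: Subject, resource: Resource) -> bool:
    # Grant access only if every tag on the data is covered by the person's
    # tags, and record the decision either way so intrusions can be spotted.
    granted = resource.labels <= subject.clearances
    audit.info("user=%s resource=%s granted=%s",
               subject.user_id, resource.name, granted)
    return granted

alice = Subject("alice", frozenset({"us", "partner-releasable"}))
plan = Resource("ops-plan", frozenset({"partner-releasable"}))
print(authorize(alice, plan))   # True, and the decision lands in the audit log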

The DoD CIO’s office got the message multiple times about function over form when it comes to why the user’s experience is so important.

The first came in the “fix my computer” post by Michael Kanaan, the director of operations for the Air Force – MIT Artificial Intelligence Accelerator, which went viral in June 2022.

The second moment of truth came from the Defense Business Board in February 2023. The DBB released survey results showing 80% of survey respondents rating their user experience as average or below average. Out of about 20,000 respondents, 48% rated their experience as “worst,” and 32% fell into the category of average.

Over the last year, the DoD CIO’s office has been addressing both process and technology.

DoD’s holistic perspective

DoD CIO John Sherman said last summer that the idea is to bring some standardization to the refresh cycle across all of the military and ensure user experience is a part of every technology initiative.

Beavers said now that Kong is on board, he is shaping DoD’s user experience effort.

“We’re looking at it from a holistic perspective because user experience is more than just having the latest equipment. It is all around the functionality and in the department, it’s different than in the commercial world,” she said. “If you think about an F-35, it’s a flying interoperable networked computer with the pilot. So the user experience is from the warfighters’ perspective. But whether you’re sitting in an operations room or behind a desk or out in a plane or on a ship, does your IT and your communications equipment work together and can you stay secure? The department is also standing up a big effort to get after the IT for the warfighter.”

Through this initiative, Beavers said the challenges are much different, ranging from a huge install base to legacy technology not designed to be interoperable and a limited budget.

At the same time, Beavers said there’s a lot of opportunity to make some improvements to the user experience.

“We should make a concerted effort to look at where our policies are standing in the way of the interoperability. Where do we need an engineering solution? And where do we need just a process change?” she said. “The department is really pretty good at buying big things over long periods of time and buying quick things and bringing them when there’s an imperative like a war. But it’s not ingrained as part of the standard operating procedure in the department as much as we would like so we’re working on building that piece out, to help bring in the new technology and also to improve the customer experience.”

DoD, VA collaboration

Beavers added DoD is using the Lean Six Sigma business process improvement approach to help sort through the potential changes and to better understand the broader impacts of process and policy revisions.

Some recent work with the Veterans Affairs Department is a customer experience win, Beavers said.

At the North Chicago Veterans Medical Center, VA and DoD staff have worked closely together for the past decade or more. But their systems and networks were separate and data sharing was basically non-existent.

She said in some cases, it would take around 36 mouse clicks to send an email between DoD and VA.

“We spent the last six months pivoting to Office 365 in the cloud and turning on some business functionality,” Beavers said. “This really was a cooperation problem where the security folks on both sides had to decide to configure the clouds the same way to enable that interoperability. We are rolling that out now to the people working in less than six months.”
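The fix Beavers describes, getting both security teams to configure their clouds the same way, is essentially a configuration-drift problem. Here is a minimal sketch of a drift check, with invented setting names rather than real Office 365 baseline keys.

# Hypothetical sketch: flag settings that differ between two tenant configs.
def config_drift(a: dict, b: dict) -> dict:
    """Return settings whose values differ (or exist on only one side)."""
    keys = set(a) | set(b)
    return {k: (a.get(k), b.get(k)) for k in keys if a.get(k) != b.get(k)}

# Illustrative settings only; not an actual O365 security baseline.
dod_tenant = {"mfa_required": True, "external_sharing": "allowlist", "tls_min": "1.2"}
va_tenant  = {"mfa_required": True, "external_sharing": "blocked",   "tls_min": "1.2"}

print(config_drift(dod_tenant, va_tenant))
# {'external_sharing': ('allowlist', 'blocked')} -> an item for both
# security teams to reconcile before interoperability will work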

The post DoD’s approach to fix its computers is function over form first appeared on Federal News Network.

Drones becoming central to a variety of CBP’s mission sets (Fri, 15 Mar 2024)
https://federalnewsnetwork.com/ask-the-cio/2024/03/drones-becoming-central-to-a-variety-of-cbps-mission-sets/

Quinn Palmer, the national operations director for small unmanned aircraft systems at CBP, said drones are bringing more benefits to the agency every year.

From search and rescue to intelligence, surveillance and reconnaissance to inspecting towers, Customs and Border Protection is demonstrating how drones are more than just a fun hobby.

CBP is recognizing the time and cost savings and, more importantly, the officer safety that small unmanned aircraft can provide.

Quinn Palmer, the national operations director for small unmanned aircraft systems at CBP in the Homeland Security Department, said the use of drones has evolved across the agency’s mission sets.

“Small drones are really filling a critical niche between fixed surveillance systems and crewed aviation or manned aviation assets because of their range, because of their price point and their quick deployability,” Palmer said on Ask the CIO. “They can offer us surveillance over a much larger area on the border, like for search and rescue where we can cover broad swaths of territory very quickly. But another interesting piece of that is the nature of the drone, meaning its covertness. That’s been a hugely impactful component to why drones are so valuable to us and to our agents in the field. What I mean by that is having the ability to surveil a target or a law enforcement situation covertly or silently allows our folks that situational awareness, that critical time element, to prepare more smartly to position themselves to make that initial engagement, which lends itself to officer safety, but also to the effectiveness of the law enforcement resolution.”

This type of impact is true across many CBP mission sets. From border surveillance and related missions to facility and tower inspections to creating training videos for internal communications, the agency is using these small unmanned aircraft systems in more ways than ever imagined.

CBP flew 100,000 sorties in 2023

To that end, Palmer said CBP has grown its drone pilot corps to about 2,000 strong, operating more than 330 systems, up from just half a dozen systems and 20 operators about five years ago. It plans to grow to more than 500 assets and continue to train and hire operators in 2024.

“The response by the field, by the folks that are out there on the front line, has been to really engage in and advocate for this capability and this technology. The leadership now sees the value too,” Palmer said. “It’s always a trade-off when you’ve got a workforce that’s stretched amongst many competing requirements and commitments; adding one more thing to do is something we’ve got to be very conscious about. It can be a distraction. It can be a negative to the labor cost of conducting a border security mission. But drones have not been that. It’s been a labor-saving capability. We see an effect at the ground level, not just in the price tag but in the time it takes to resolve law enforcement situations.”

In 2023, CBP flew about 20% of all of the direct air support missions for ground agents of the border patrol. From those flights came 48% of all apprehensions and seizures, Palmer said.

“We’re putting out about 25% of the output, but yielding about 50% of the outcome. That’s due to the proliferation of more drones being more places than manned aviation, but also the nature of the drone being covert and the effectiveness it lends itself to in that interdiction aspect,” he said. “We apprehended about 42,000 folks crossing the border illegally. In fiscal 2020 through 2023, about 2,800 pounds of narcotics were seized, 95 vehicles seized and 13 weapons seized. That resulted from about 100,000 sorties and about 50,000 hours flown.”
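A quick back-of-the-envelope calculation ties the two sets of figures together; the inputs below are just the approximate shares quoted above.

# Approximate 2023 figures quoted above.
drone_share_of_missions = 0.20   # ~20% of direct air support missions
drone_share_of_outcomes = 0.48   # ~48% of apprehensions and seizures

# How much drone sorties "over-deliver" relative to their share of missions.
leverage = drone_share_of_outcomes / drone_share_of_missions
print(f"outcome leverage: {leverage:.1f}x")  # ~2.4x, consistent with the
# rough "25% of the output, 50% of the outcome" framing in the quote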

Sustainment plans for drones

All of those efforts in using drones instead of manned aviation in 2023 resulted in about $50 million in cost avoidance. Palmer said that money can be put back into mission and operational priorities, helping the agency stretch its limited budget.

“We’re actually benefiting not just from the cost savings associated with deploying drones versus some of these other more expensive surveillance capabilities. But we’re also benefiting because we’re able to control that interdiction much more efficiently, which translates into savings on the ground level because the labor costs and the time associated with accomplishing that interdiction and that resolution are minimized,” he said. “In many different ways, we’ve found that drones are making an impact, and it’s not just from the budgetary standpoint; they’re impacting the tactical advantage in the field.”

As with any new technology, CBP is learning how to manage the drones and educating the industry.

For example, the agency runs drones in austere environments whether cold, heat, dust or precipitation in a way that many manufacturers didn’t intend the systems to run in.

“We are using our equipment a lot compared to some of the other drone users in the United States. We’ve had industry partners say we never intended this to fly this much. We’re like, ‘well, don’t sell it to us,’” Palmer joked.

Palmer said this means having a strict sustainment plan is more important than ever to keep the drones flying.

“This gentleman at the National Transportation Safety Board (NTSB) told me this, and I’ll share it with you because I thought it was very relevant: Drones are engineered to do very sophisticated things. But they’re engineered at the same level as the toaster on your kitchen counter. So we do very intricate and very sophisticated things with drones, but they are consumable, for lack of a better term,” he said. “For our higher-cost assets, we do have sustainment plans and lifecycle plans associated with those acquisitions. We do our due diligence to make sure that battery rotation and those kits are tracked, and the motor arms and the propellers are replaced per manufacturer specifications. We’re doing all those kinds of things on the ground. But ultimately, a small drone should be considered a consumable. They’re just not built to sustain.”
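The sustainment plan Palmer describes (tracking battery rotation and replacing motor arms and propellers per manufacturer specifications) amounts to lifecycle bookkeeping for consumable parts. Here is a minimal sketch of that bookkeeping, with invented part names and service-life limits.

# Hypothetical sketch: track consumable drone parts against manufacturer
# replacement limits. Names and hour limits below are illustrative only.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    hours_flown: float
    replace_after_hours: float   # manufacturer-specified service life

    def due_for_replacement(self) -> bool:
        return self.hours_flown >= self.replace_after_hours

fleet_parts = [
    Component("battery-A12", hours_flown=96.0, replace_after_hours=100.0),
    Component("propeller-3", hours_flown=210.0, replace_after_hours=200.0),
]

for part in fleet_parts:
    if part.due_for_replacement():
        print(f"REPLACE {part.name}: {part.hours_flown:.0f}h >= "
              f"{part.replace_after_hours:.0f}h limit")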

At the same time, Palmer said the marketplace is moving so fast that CBP, or any organization, could move to the next generation fairly quickly and inexpensively, outweighing the cost of long-term sustainment plans.

The post Drones becoming central to a variety of CBP’s mission sets first appeared on Federal News Network.

GSA’s emerging tech framework is a priority setter for AI (Fri, 08 Mar 2024)
https://federalnewsnetwork.com/ask-the-cio/2024/03/gsas-emerging-tech-framework-is-a-priority-setter-for-ai/

Eric Mill, director of cloud strategy at GSA, said comments on the draft Emerging Technology Framework are key to ensuring the decision process is correct.


When it comes to adopting secure artificial intelligence capabilities, the General Services Administration is doing all it can to make sure the government isn’t late to the game.

The draft Emerging Technology Framework from the cloud security program known as FedRAMP could be a key piece to that effort, especially if industry and agencies help drive the new approach.

Eric Mill, director of cloud strategy in the Technology Transformation Service in GSA, said the draft framework, for which comments are due March 11, is helping to ensure agencies get the expected benefits of using secure AI and large language models.

Eric Mill is the director of cloud strategy in the Technology Transformation Service in the General Services Administration.

“This is strategically important for the program because what we’re doing here is FedRAMP is prioritizing its work around the strategic goals that the government has. It’s not just a first in, first out program. We are breaking a little bit of ground for the program,” Mill said on Ask the CIO. “That is something we think is a good thing. As we engage in a prioritization process, which is really important for what FedRAMP does, we have to make sure it’s well understood, that we are transparent to stakeholders, and that it is fair and clear. That’s the foundation we’re trying to lay with this framework.”

GSA released the draft framework in late January as part of its effort to meet the requirements of the AI executive order President Joe Biden signed in October. In the document, GSA says it’s initially focused on emerging technology capabilities that use large language models (LLMs) and include chat interfaces, code-generation and debugging tools and prompt-based image generators.

Mill said the framework will help prioritize and manage the excitement around AI and LLMs.

“How do we strike the right balance? And, then, how do we operationalize that? How is it that we are prioritizing this thing in effect and that means having to come up with things like limits?” he said. “So part of what you see in the framework is the proposal that we stop at three. When we have three services that are based around chatbots, for example, using generative AI, and we’ve prioritized three of those things, we’re going to stop prioritizing that until we come back around and think again about what the priorities of FedRAMP should be. That is making sure that when we say prioritize, we’re actually prioritizing, and we’re not just focusing on AI as a program. FedRAMP is a program for the entire cloud market. But we want to be able to support this initiative so this is important strategically for figuring out how we answer those kinds of questions that are not at all totally AI specific.”
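To make the “stop at three” rule concrete, here is a minimal sketch of a capped prioritization queue; the capability labels and submission format are invented for illustration, and this is not FedRAMP’s actual intake process.

# Hypothetical sketch of the "stop at three" rule: prioritize at most N
# offerings per emerging-technology capability, then defer the rest until
# the priority list is revisited.
from collections import defaultdict

CAP_PER_CAPABILITY = 3

def prioritize(submissions):
    """submissions: iterable of (provider, capability) in arrival order."""
    counts = defaultdict(int)
    prioritized, deferred = [], []
    for provider, capability in submissions:
        if counts[capability] < CAP_PER_CAPABILITY:
            counts[capability] += 1
            prioritized.append(provider)
        else:
            deferred.append(provider)   # waits for the next prioritization cycle
    return prioritized, deferred

subs = [("A", "genai-chat"), ("B", "genai-chat"), ("C", "genai-chat"),
        ("D", "genai-chat"), ("E", "code-gen")]
print(prioritize(subs))  # (['A', 'B', 'C', 'E'], ['D'])

The cap is what keeps “prioritize” meaningful: once three chat-based generative AI services are in front of the reviewers, the fourth waits, while a code-generation tool still gets through.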

GSA to manage concerns over backlogs

That prioritization and limits to the number of cloud services is exactly why Mill said GSA is pushing vendors and others to comment on the draft framework.

He acknowledged the limitations, especially around AI, could cause some heartburn for vendors. FedRAMP already is seeing a lot of interest from vendors and agencies alike around AI and LLM services in the cloud.

“We definitely are seeing some services that have already been in the marketplace that have added AI capabilities. We’re seeing things come in through the agency review process. We’re expecting that to go up,” Mill said. “We’re not responding to an abstract thing, but the things that we actually see coming in front of us.”

One of the big issues GSA still must address is what metrics or benchmarks it should use to determine if a technology fits into one of the three priority categories.

Mill said GSA is aware that a backlog of vendors asking for their AI capabilities to go through the review process could build, and that it in turn could create a bigger backlog for more typical cloud services.

“We very much are intent on making sure that the urgency that we see around accelerating the government’s use of emerging technologies doesn’t compete with those other things. That it doesn’t worsen the problem,” he said. “That is part of what we mean when we talk about the prioritization process and some of the limits associated. That’s how we’re ultimately going to make sure that the program stays responsive. We’re very engaged on short and long term structural changes to make sure that the program is operating at the pace that it should. We are treating speed as the security property that we know that cloud providers and agencies all believe in as well. That’s the spirit that you should see from us. And we’ll have a lot more to say later this year.”

More on tap for FedRAMP

Mill said he couldn’t speak to the timeline to get version 1 of the framework out. He said he doesn’t expect GSA to sit on the comments, and any updates from those comments, for a long time. But, he said, it also will depend on what people say about the framework and how much GSA got correct already.

“I think we’re very much expecting for this to be an iterative process. This is not going to be the only bite at the apple for engaging with the FedRAMP team about this framework. Folks should absolutely feel free to reach out and suggest how we can do better on that,” he said. “We put a lot of effort into that [blog] post to sharpen those questions. We absolutely encourage folks to go read the announcement and weigh in on those questions. Chief among them is this question of: are we measuring this right? I think the concept of prioritization means making some kind of hard choice somewhere, so when the agency does that, we want to know that, at the very least, everybody understood why we would make that decision and what factors went into that.”

Mill said beyond finalizing the framework in the coming months, other priorities for FedRAMP center on improving the customer experience for both agency and industry users, and understanding the costs involved in obtaining approval.

Mill said GSA is trying to make sure it is on the same page with vendors about the time and cost to get through the security process.

“Whether what we think it takes is the same as what the cloud providers think is one of the exercises that we’re going to be engaged in this year. We are updating what some of the key metrics are around that and talking pretty directly with stakeholders before we finalize those things. We will be keeping a feedback loop so that we are really orienting ourselves formally as a customer-oriented program in that way,” he said. “I think you’ll see us engaging in that in a pretty public way, maybe in a more tangible, mechanical sense. We’re definitely focused on speed as a security property. We’re definitely very interested in identifying cloud providers that want to pilot different ways of working. There’s never been a more open mind to looking at process changes and piloting different approaches that don’t lower the bar for security, but allow us to focus the review energy on the process and on the items that we all understand are the most closely tied to security.”

Of course, Mill said once the draft memo from the Office of Management and Budget is finalized, a whole new set of priorities will open up.

“I hope folks see there is a sense of energy and responsiveness where the program wants to hear where it can change and where it can do a better job of threading that eternal needle of speed, security and everything else people want from the system,” he said. “It is not trivial, but it is the whole job of the program. I think there’s going to be not just this Emerging Technology Framework, but a pretty good series of feedback opportunities over the course of the year. I really encourage folks to come at that with the spirit of improving these processes, and please bring up things that maybe died on the vine a few years ago. But let’s not let the past foreclose the future. There’s not been a more open-minded period of time in the program than what’s there right now.”

The post GSA’s emerging tech framework is a priority setter for AI first appeared on Federal News Network.
