Cloud Computing - Federal News Network
https://federalnewsnetwork.com

Energy working with renewables industry, cloud providers on cyber requirements
https://federalnewsnetwork.com/cybersecurity/2024/06/energy-working-with-renewables-industry-cloud-providers-on-cyber-requirements/
Wed, 19 Jun 2024 19:23:35 +0000
CESER's work with cloud service providers comes amid growing threats to critical infrastructure, as well as questions about cloud security responsibilities.

The Energy Department’s cybersecurity office will work with cloud service providers and the renewable energy industry this year to help delineate cyber protection requirements for the sector.

The work is being led out of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER). It comes amid growing concerns about hackers infiltrating U.S. critical infrastructure, including the electric grid.

Puesh Kumar, the director of CESER, said “traditional large fossil generation” is often prohibited by regulations from using the cloud. But he said renewable energy providers are often starting out by relying on cloud computing.

“But really, we haven’t really sat down to define what are the security requirements? Who owns what part of the security picture? Is that the owner and operator? Or is it the cloud service provider?” Kumar said during a cybersecurity panel discussion hosted by Semafor in Washington on Tuesday.

“One of the big efforts that we’re going to be undertaking this year is really bringing together companies like [Google], to actually come together and establish those requirements for both sides, so that we can set up the energy sector of the future with that security built in,” Kumar added.

The CESER office is tasked with addressing emerging threats to energy infrastructure, including cyber risks, climate change and physical security. CESER is leading several initiatives to secure new energy technologies from cyber threats. Those programs are funded as part of the $27 billion Congress provided the Energy Department to modernize the electric grid in the 2021 Infrastructure Investment and Jobs Act.

Kumar said the energy sector is going through “tremendous change” right now.

“We’re trying to combat the climate risk,” he said. “We’re trying to deploy more clean energy. We’re trying to deploy more renewables and electric vehicles and all that’s really great. And that can be a source of resilience in our energy sector in the United States. It can bring online more generation that hasn’t been online into our grid. But we also have to do that with security in mind. And so, as we’re fundamentally changing this grid, we have to ensure that security is baked into it.”

In addition to cyber threats targeting the electric grid, policymakers are also focusing more on the so-called “shared responsibility model” that lays out the cybersecurity responsibilities of cloud providers and their customers. The security responsibilities of cloud providers have come under particular scrutiny in the wake of China’s hack into Microsoft’s cloud email infrastructure last year.
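
The split that model describes varies with how much of the stack the provider runs. As a rough, generic illustration (the boundaries below follow a common industry convention and are not any particular provider's terms), a simple lookup shows how responsibility typically shifts between provider and customer across IaaS, PaaS and SaaS:

```python
# Simplified illustration of a cloud shared-responsibility split.
# The boundaries are a common industry convention, not any provider's actual terms.
RESPONSIBILITY = {
    #  layer                 IaaS        PaaS        SaaS
    "physical facilities": ("provider", "provider", "provider"),
    "hypervisor/network":  ("provider", "provider", "provider"),
    "operating system":    ("customer", "provider", "provider"),
    "application code":    ("customer", "customer", "provider"),
    "identity & access":   ("customer", "customer", "customer"),
    "data classification": ("customer", "customer", "customer"),
}

def who_secures(layer: str, model: str) -> str:
    """Return which party is responsible for a layer under a given service model."""
    idx = {"IaaS": 0, "PaaS": 1, "SaaS": 2}[model]
    return RESPONSIBILITY[layer][idx]

if __name__ == "__main__":
    for layer in RESPONSIBILITY:
        print(f"{layer:22s}", {m: who_secures(layer, m) for m in ("IaaS", "PaaS", "SaaS")})
```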

Jeanette Manfra, global director for security and compliance at Google, argued large cloud providers can make security “cheaper and easier” for their customers. Manfra is a former Cybersecurity and Infrastructure Security Agency official.

“There’s a huge opportunity to leverage that scale, and to drive cloud providers to increase that level of security and safety and reliability,” Manfra said during the Semafor event. “I do believe it is the responsibility of cloud providers, particularly the largest ones, who are increasingly serving more and more critical infrastructure sectors, to have that high bar of security and safety. But there’s also risk because you start to consolidate on just a few companies. And so you have to think about what does that mean, that concentration risk? You have to think from a policy perspective of how you both leverage that opportunity, while also managing that potential concentration risk.”

Navy project brings promise of cloud to the middle of the ocean
https://federalnewsnetwork.com/navy/2024/06/navy-project-brings-promise-of-cloud-to-the-middle-of-the-ocean/
Mon, 17 Jun 2024 12:03:54 +0000
Aboard the U.S.S. Abraham Lincoln, the Navy is figuring out what's possible when it has enormous data pipes that have never before been available to ships.


From virtual desktops to email and collaboration, the Navy has been leaning heavily on cloud services to speed up its digital modernization efforts. But those efforts have come with a big question: Will any of this work aboard ships? It turns out the answer is yes.

In a pilot project, the Navy has shown it’s possible to consistently move several terabytes of data each day between the cloud and thousands of users aboard an aircraft carrier, an advance officials say is a “game changer.”

The project is called Flank Speed Edge, an extension of Flank Speed, the Navy’s broader cloud environment. The largest test case has been aboard the U.S.S. Abraham Lincoln, which is currently underway in the Pacific, and represents the first major example of the Navy connecting a vessel at sea with cloud services in a way that’s on par with what sailors get on shore.

Leveraging P-LEO satellites

It’s mostly thanks to the advent of Proliferated Low Earth Orbit (P-LEO) satellite services — massive constellations of small satellites that form mesh networks via optical links with one another in space, and deliver high-bandwidth, low latency communications to users back on Earth.
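
The latency advantage follows directly from orbital altitude. A back-of-the-envelope comparison, assuming roughly 550 kilometers for a typical P-LEO shell and 35,786 kilometers for geostationary orbit and ignoring processing and routing overhead, shows the gap:

```python
# Back-of-the-envelope propagation delay: LEO vs. GEO satellite links.
# Altitudes are typical published values; real latency adds processing,
# queuing and inter-satellite hops, so treat these as rough lower bounds.
C = 299_792.458  # speed of light, km/s

def round_trip_ms(altitude_km: float) -> float:
    """Ground -> satellite -> ground and back: four altitude traversals at minimum."""
    return 4 * altitude_km / C * 1000

leo_ms = round_trip_ms(550)      # proliferated LEO shell (~550 km)
geo_ms = round_trip_ms(35_786)   # geostationary orbit

print(f"LEO round trip: ~{leo_ms:.0f} ms")   # roughly 7 ms
print(f"GEO round trip: ~{geo_ms:.0f} ms")   # roughly 477 ms
```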

Cmdr. Kevin White, the combat systems officer aboard the Lincoln, said the initial idea was to install a gigabit’s worth of satellite connectivity aboard the ship and see what the ship’s 5,000 sailors and Marines could do with it. It turns out, quite a lot.

“I’ve seen a tremendous value from from this afloat. All of the staff are using their Flank Speed capabilities to maintain continuity,” he told the Navy CIO’s recent IT conference in Norfolk, Virginia, during a live video demonstration from the Pacific Ocean. “They’re using their NMCI phones to call home over voice over IP, or to call the beach to say, ‘Hey, I need this part rushed to the ship.’ We’re using it across all of our departments and embarked commands for quality-of-work type areas. Everything from our training department — ensuring that all of our readiness in our training cycle is up to date — to our medical department, to our supply department, they’re all reaching out over websites and services to ensure that we have continuity of operations, and ensure that this ship is ready to go when the time comes that we have to turn these services off.”

One thing the Navy has learned from the Lincoln experience is that Flank Speed Edge doesn’t require a huge amount of manpower. It’s taken just three full-time sailors to operate and maintain the new satellite and Wi-Fi infrastructure aboard the carrier.

And in return, it’s also dramatically expanded the kinds of software upgrades and updates that can be performed on other systems on the ship, White said. Traditionally, that’s the kind of work that can only be done at a pier with a physical network connection.

“While we’re out at sea right now, with this P-LEO capability, a cloud connected node and all the right elements in place, we’re able to scale new capabilities as they become available and rapidly deploy them while they’re monitored from the shore side,” he said. “One of the big challenges we have is the cycle of Windows updates and the cycle of patches, and with that high-speed capability, we can have those update services enabled. Onboard, we have 2,000 staff folks, all of which are live at their home commands on Flank Speed. Imagine a future where we are able to migrate that data to an embarkable [laptop], and allow them to interoperate with that data when we have to turn off our connections.”

The approach does have its limitations. Besides the obvious need to sometimes shut down those high-speed data links for operational reasons — leaving the ship with only its onboard tactical cloud nodes — the P-LEO connections, so far, are only authorized for unclassified data.

But White said the on-board infrastructure is designed to be transport agnostic — so that it can use whatever connectivity mechanism is available — from traditional military SATCOM to commercial services like Starlink. It’s also designed to incorporate software defined networking, so that the network capacity available through those data links can be used however the Navy sees fit, and can be reallocated on the fly.

“Right now our logs are showing that we’re able to pass between 3 and 5 terabytes of data per day, which is absolutely massive. And what we’re able to do with software defined networks is scale exactly how that data is used,” he said. “Right now we’re demonstrating pushing applications like air wing maintenance apps that live in the cloud, and all of our pay and personnel apps. And that’s just scratching the surface.”
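
Those figures are consistent with the roughly gigabit-class link White described installing. A quick sanity check, assuming a sustained 1 Gbps link that real traffic never fully uses, gives the theoretical daily ceiling and the utilization that 3 to 5 terabytes a day would imply:

```python
# Sanity check: how much data can a ~1 Gbps link move in a day,
# and what utilization do 3-5 TB/day imply? Figures are illustrative only.
LINK_GBPS = 1.0
SECONDS_PER_DAY = 86_400

ceiling_tb = LINK_GBPS * SECONDS_PER_DAY / 8 / 1000  # gigabits -> gigabytes -> terabytes
print(f"Theoretical ceiling: ~{ceiling_tb:.1f} TB/day")  # ~10.8 TB/day

for observed_tb in (3, 5):
    utilization = observed_tb / ceiling_tb * 100
    print(f"{observed_tb} TB/day is about {utilization:.0f}% average utilization")
```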

Other applications ashore

The Navy is applying similar concepts in other parts of the world that may not be as hard to connect as ships, but that have still tended to face communications challenges.

The service’s 5th Fleet is serving as a pilot site for a shore-based implementation of Flank Speed Edge. At the command’s headquarters in Bahrain, staff have recently started using Flank Speed services, including Nautilus Virtual Desktop.

Lt. Cmdr. Tricia Nguyen, a staff member at Naval Computer and Telecommunications Station Bahrain, said that so far, the Flank Speed approach has turned out to be more seamless and resilient than the Navy’s traditional overseas networks.

“It is a vast improvement compared to the previous assets and legacy architecture,” she said. “The user interface is quick and responsive — applications are able to be opened natively instead of using browser-based workarounds. Simple things matter here: The file sync is seamless. I don’t have to log in multiple times like I used to; now I just boot up and my files are there. And back in March, there was a Teams service outage, which I understand was worldwide. However, here in Bahrain, we did not experience an outage at all. That was because of the architecture: We have a primary and secondary means of transport that are terrestrial based, and a tertiary that’s commercial satellite. We had an automatic failover and it was completely seamless and transparent to our end users. I didn’t even know about it until after the fact.”
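
The failover Nguyen describes, with primary and secondary terrestrial paths and a commercial satellite tertiary, amounts to a priority-ordered health check with automatic cutover. The sketch below is a minimal illustration of that idea only; the transport names and health-check stub are hypothetical, not the Navy's actual design:

```python
# Minimal sketch of priority-ordered transport failover (illustrative only;
# transport names and health checks are hypothetical, not the Navy's design).
from typing import Callable, List, Tuple

def check_link(name: str) -> bool:
    """Stand-in health probe; a real system would ping or measure loss/latency."""
    return name != "fiber-1"   # simulate an outage on the primary path

TRANSPORTS: List[Tuple[str, Callable[[], bool]]] = [
    ("terrestrial-primary",   lambda: check_link("fiber-1")),
    ("terrestrial-secondary", lambda: check_link("fiber-2")),
    ("commercial-satellite",  lambda: check_link("satcom")),
]

def select_transport() -> str:
    """Return the highest-priority transport that passes its health check."""
    for name, healthy in TRANSPORTS:
        if healthy():
            return name
    raise RuntimeError("no transport available")

print(select_transport())   # -> terrestrial-secondary (automatic failover)
```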

Bob Stephenson, the chief information officer for U.S. Pacific Fleet, said some of what the Navy has learned through the pilots — especially their use of secure Wi-Fi — may also be applicable to communications on installations, such as his command’s headquarters at Pearl Harbor.

“We’ve been using the same technology in our buildings that we’ve used since the late 90s. As our staff changes and grows, it’s very difficult for us with a wired infrastructure to bring more people into the building, or rearrange the office,” he said. “So we’re doing a pilot now sponsored by PEO Digital where we’ve gone to wireless in the buildings. We still have to use fiber for our secret networks, and we’d like to change that, but this is going to give us an enormous capability to modernize our buildings like we’re modernizing our ships.”

 

New FedRAMP updates: 5 ways federal agencies can evaluate and select the safest cloud providers
https://federalnewsnetwork.com/commentary/2024/06/new-fedramp-updates-5-ways-federal-agencies-can-evaluate-and-select-the-safest-cloud-providers/
Wed, 05 Jun 2024 17:59:56 +0000
As federal agencies ride the wave of digital transformation and embrace cloud services, the landscape of cybersecurity continues to present complex challenges.

The primary purpose of the Federal Risk and Authorization Management Program (FedRAMP) is to ensure that federal agencies can leverage the benefits of modern cloud technologies while upholding stringent security standards. FedRAMP serves as a benchmark of security assurance in the ever-expanding cloud landscape, offering a framework that helps federal agencies evaluate and select cloud providers with the highest rigor for data protection. However, there has been speculation about whether FedRAMP is fit for purpose in an increasingly complex cyber threat environment. After all, certification lags the standard by a few years, and the standard lags in identifying control mechanisms to thwart emerging cyber threats.

According to a recent public memo from The White House, “Because federal agencies require the ability to use more commercial [Software-as-a-service] products and services to meet their enterprise and public-facing needs, the FedRAMP program must continue to change and evolve.”

This evolution has now begun. Recent updates to FedRAMP have been driven by several key imperatives. First, the program needed to scale to accommodate the growing demand for cloud services across federal agencies. Second, it aimed to mature by refining its focus on the most critical aspects of data security. Third, efforts were made to streamline the software authorization process, making it more efficient and accessible. Finally, reducing costs was a central goal, making cloud adoption more viable for agencies of all sizes. In essence, these updates represent a commitment to ensuring that FedRAMP remains a robust and adaptable tool for safeguarding federal data in the face of evolving security challenges.

The threats facing federal agencies

The timing couldn’t be worse. Just as agencies are being asked to modernize and embrace cloud services, the risk factor of moving workloads into the cloud has increased manyfold. In the wake of geopolitical turmoil and the democratization of advanced AI-based technologies, federal agencies must now navigate a minefield of cybersecurity challenges while orchestrating their migration and selecting cloud partners. Access to new technologies has armed cybercriminals, state actors and malicious entities with unprecedented access to hacking techniques and tools. We now operate in a world where AI/ML algorithms can be used to create malicious code, where social engineering and identity theft are more sophisticated than ever, and where software supply chains are only as strong as their weakest link. What’s more, the alarming emergence of ransomware-as-a-service – malicious software that’s readily available on the darknet – poses a substantial danger. In this environment, federal agencies must prioritize advanced security measures in their cloud services, recognizing the imperative of safeguarding sensitive data and systems from these evolving and multifaceted threats.

5 criteria federal agencies should use when selecting cloud providers

FedRAMP remains a key framework for security assurance and its updates will prove useful, but in the wake of mounting threats, here is a selection of criteria that chief information officers, chief information security officers and chief technology officers in federal agencies should consider when selecting a cloud provider.

  1. Embrace a “defense-in-depth” approach

One fundamental principle of cloud security is adopting a “defense-in-depth” strategy. Federal agencies should seek cloud service and SaaS providers that employ multiple layers of control mechanisms to protect their data assets, including perimeter security, application security and data encryption. This approach ensures that even if one layer of security is breached, others remain intact, halting potential threats. (A simplified sketch of this layering follows the list.)

  2. Explore beyond FedRAMP standards

While FedRAMP provides a robust framework for cloud security, forward-thinking agencies should explore additional security measures. For example, they should consider whether their preferred SaaS solution provider has implemented a zero trust architecture, ensuring that data can only be accessed on a “need-to-know” basis; the sketch following this list shows where such a check sits among the defensive layers. Solutions that have deployed artificial intelligence-based security methods for threat analysis and detection, and user behavior analysis, will also stand agencies in good stead, particularly when it comes to monitoring software supply chains and the flow of data.

  3. Assess qualifying authorizations

Federal agencies should evaluate cloud providers not only on FedRAMP requirements but also on other qualifying authorizations they may hold. Consider providers with certifications such as System and Organization Controls (SOC) 2, relevant International Organization for Standardization (ISO) standards, or special designations such as AWS Government Competencies that speak to the stringent security requirements of public agencies. Microsoft likewise publishes FedRAMP authorizations and DoD impact level ratings for its services, which can help agencies gauge the suitability of various offerings.

  4. Examine partner network maturity

A cloud provider’s partner network plays a pivotal role in security. Assess the maturity and reliability of partners like CrowdStrike, AWS and Microsoft. A strong partner network can enhance an agency’s overall security posture.

  5. Verify proactive security measures

Staying ahead of evolving threats is crucial. Confirm that the chosen cloud provider has a proven track record of proactive security measures and innovations. Leading providers continuously evolve their offerings to protect data hosted in their environment, often including real-time analytics and threat monitoring.
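
As a rough illustration of how the layering in item 1 and the need-to-know access checks in item 2 fit together, the sketch below runs a request through independent control layers, any one of which can stop it. It is conceptual only; the layer names and policy rules are hypothetical and not drawn from any specific provider or framework:

```python
# Conceptual defense-in-depth pipeline: a request must clear every layer.
# Layer names and the need-to-know policy are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    resource: str
    source_ip: str

ACCESS_POLICY = {"case-files": {"analyst-1"}}   # need-to-know allowlist
TRUSTED_RANGES = ("10.", "192.168.")            # stand-in perimeter rule

def perimeter(req: Request) -> bool:
    return req.source_ip.startswith(TRUSTED_RANGES)

def need_to_know(req: Request) -> bool:
    return req.user in ACCESS_POLICY.get(req.resource, set())

def data_encrypted_at_rest(req: Request) -> bool:
    return True   # stand-in: verify the data store enforces encryption

LAYERS = (perimeter, need_to_know, data_encrypted_at_rest)

def allow(req: Request) -> bool:
    """Grant access only if every independent control layer approves."""
    return all(layer(req) for layer in LAYERS)

print(allow(Request("analyst-1", "case-files", "10.2.3.4")))   # True
print(allow(Request("intern-7",  "case-files", "10.2.3.4")))   # False: fails need-to-know
```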

As federal agencies ride the wave of digital transformation and embrace cloud services, the landscape of cybersecurity continues to present complex challenges. The recent updates to FedRAMP signify a commitment to adaptability in the face of these evolving threats, but to safeguard their data, federal CIOs, CISOs and CTOs should look beyond government frameworks to ensure their cloud adoption strategies can move forward with confidence. Making informed choices about cloud providers is not just a matter of compliance but a critical step in securing the future of federal agencies and the fulfillment of their charter.

Manish Sharma is the senior vice president of engineering and security at Aurigo Software.

Army turning up cyber protections of network, data access
https://federalnewsnetwork.com/army/2024/05/army-turning-up-cyber-protections-of-network-data-access/
Fri, 31 May 2024 22:13:44 +0000
Maj. Gen. Chris Eubank, commander of NETCOM, said soldiers and civilians will no longer be able to download data to their devices from outside the Army network.


PHILADELPHIA — The Army is making a major change to how soldiers and civilians access data through their email and other applications in early June.

Starting on June 11, the Army is shutting down the network port that lets users pull data through commercial internet providers onto their laptops or cell phones.

Maj. Gen. Chris Eubank, commander of the Network Enterprise Technology Command (NETCOM), said the decision to turn off what is commonly known as “Flow 3” came down to two factors. One is basic cybersecurity and protecting data and networks. The second, however, was the maturity of the Army’s virtual desktop initiative (VDI) and overall network architecture.

Maj. Gen. Chris Eubank is the commander of the Network Enterprise Technology Command (NETCOM).

“What we’re really going to shut down is the ability to go into the Army’s network and pull the information through the internet to your device, whether it’s a government furnished device or a personal device. What we’re doing is we’re going to cut off that access so you’ll still be able to get to those services, via your personal device using a Common Access Card (CAC) or from a government furnished piece of equipment using a CAC using the commercial internet, but it’s all going to be through virtual means,” said Eubank during an interview with Federal News Network at the Army TEMS conference last week. “Using technologies in our bring-your-own-device, remote capable workforce portfolio like Azure virtual desktop, individuals will still be able to plug in via the commercial internet, CAC enabled, get to that information, but they will not be able to pull the information out of that environment. It will stay resident in the cloud. When they disconnect their session, there’s nothing left behind [on the device]. It’s really about protecting both the network and our workforce.”
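
In effect, the change swaps one permitted path (pulling data down over the commercial internet) for another (working with the data inside a CAC-authenticated virtual session). A minimal sketch of that access decision, using hypothetical field names rather than the Army's actual policy engine, looks like this:

```python
# Minimal sketch of the access decision described above (illustrative only;
# field names are hypothetical, not the Army's actual policy schema).
def allow_access(path: str, cac_authenticated: bool, via_virtual_desktop: bool) -> bool:
    """Permit access to Army data only inside an authenticated virtual session."""
    if not cac_authenticated:
        return False                 # CAC is required on every path
    if path == "direct-download":
        return False                 # "Flow 3"-style pulls are shut off
    return via_virtual_desktop       # data stays in the cloud-hosted session

print(allow_access("direct-download", True, False))   # False: blocked after the change
print(allow_access("virtual-session", True, True))    # True: VDI over the commercial internet
```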

The Army has provided this type of access through the commercial internet for years, specifically for members of the guard and reserves. This capability became even more critical during the pandemic when more soldiers and civilians worked remotely.

Army needs to increase data protections

Eubank said because the threat landscape has changed so dramatically in the last three or four years, the Army made the decision to shut off the ability to download data through commercial internet providers. Eubank signed a strategic communications message in the beginning of May to initiate this change.

He said NETCOM is trying to make this transition easier for soldiers and civilians by providing them with a QR code to download the VDI application.

“They can click on a link and it’ll sign them up for Azure virtual desktop. They can do the same thing on Hypori. That enables them to get that account setup and then if you have any questions all you have to do is reach out to NETCOM,” he said.
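
Generating that kind of enrollment QR code is straightforward. The snippet below is a generic illustration using the open-source qrcode Python package; the enrollment URL is a placeholder, not the Army's actual sign-up link:

```python
# Generic illustration of producing an enrollment QR code.
# Requires the open-source "qrcode" package (pip install qrcode[pil]).
# The URL below is a placeholder, not the Army's actual sign-up link.
import qrcode

ENROLLMENT_URL = "https://example.mil/vdi-enrollment"   # hypothetical

img = qrcode.make(ENROLLMENT_URL)    # returns a PIL image of the QR code
img.save("vdi_enrollment_qr.png")    # scan with a phone to open the link
print("Wrote vdi_enrollment_qr.png")
```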

Jared Shepard, CEO and president of Hypori, said in an email to Federal News Network that these steps will make a big difference for how the Army provides secure access to unclassified network resources, while also reducing the attack surface and the potential loss of controlled unclassified information (CUI) data.

Jeff Duran, an Army reservist who also serves as a contractor to Hypori as its Army evangelist, said in an email to Federal News Network that getting his email through his phone makes his reserve job easier.

“As a senior noncommissioned officer, there’s a lot of coordination I have to do and being able to do that without being on an Army computer makes my day a lot easier. If I’m not on my personal computer, a whole day could go by without knowing I had an important email,” he said. “Now, I’m no longer causing delays and people aren’t waiting on me.”

Transitioning from JRSS to SD-WAN

Eubank said the Army is able to shut down this type of access because of the success of its VDI rollout over the past year or more.

He said the number of users is increasing and the technology is proving itself out.

“We are still testing a mobile access management solution for mobile use as well,” Eubank said.

Along with the VDI rollout, Eubank said he is also focused on the move away from the Joint Regional Security Stacks (JRSS) to a software-defined wide-area network (SD-WAN). The Defense Information Systems Agency has told the services it will shut down JRSS in 2027, so Eubank said the Army is in the middle of planning the transition to the new capabilities over the next few years.

“All the planning now is the goal and then [implementation] will really, really start in earnest probably in the fall. Behind the scenes what we’re doing is we’re asking all of our theater signal commands, all of our signal brigades and the network enterprise centers, as DISA looks down to shut down the JRSS, here’s the services it gives us, what it means to you and what your timeline looks like to move off of it,” he said.

How the pandemic changed IRS technology for good
https://federalnewsnetwork.com/ask-the-cio/2024/05/how-the-pandemic-changed-irs-technology-for-good/
Wed, 29 May 2024 13:01:03 +0000
Former IRS CIO Nancy Sieger, who will retire on June 1 after more than 40 years in government, said she found success during the pandemic by managing its risks.


Through the pandemic, the IRS learned it can move with urgency. And now that the emergency has subsided, Nancy Sieger, the former IRS chief information officer, believes that lesson isn’t going to waste.

Nancy Sieger is retiring from federal service after serving as the IRS CIO and Treasury Department’s CTO.

Sieger, who will retire on June 1 after more than 40 years of federal service, including the last one as the Treasury chief technology officer, said IRS is building on the IT modernization lessons learned over the past few years.

“I think technologists saved the day during the pandemic. As the IRS CIO, I had the opportunity to lead IRS efforts to ensure that services to the public were handled in the most efficient way possible. If you think back to that time, businesses shut down, cities were practically shut down, and our economy was suffering and human beings were suffering. IRS focused really hard to issue three rounds of Economic Impact Payments. I am most proud of how IRS leadership and employees rallied to get money to the people in this country who needed it the most,” Sieger said during an “exit” interview on Ask the CIO. “We had a principle that any new technology would be built in a modernized way. We were really good at relying on the older systems and delivering fast. One of the opportunities we had with the Economic Impact Payments, looking to the future, feeling like IRS might be called upon again to do something similar. We had to challenge ourselves to say that may be easy and fast to build upon old operations, but how do we do this in a modernized way so that it’s repeatable? There were three rounds of payments, each round of payments came faster and faster, culminating within 24 hours. The Economic Impact Payments and that processing were built using new tools, new testing methods, new quality assurance processes and built in a modernized way. If IRS has to do that again, the strong foundation will be there.”

Sieger said it took constant reminders to build the confidence of the developers and engineers to the point where she and then-IRS Deputy CIO Kaschit Pandya, who is now the agency’s CTO, met daily with the technology workers who were writing code and analyzing it.

“We often had to say to our folks, ‘no, no, you have my permission to do it this way. Not [the old] way.’ It was risky. We managed those risks,” she said. “But ultimately, it resulted in little-to-no rework. I would say to you, on behalf of Kaschit and myself, the hours we spent with a team doing this the way it needed to be done was very fulfilling.”

IRS can accept, manage risks

That experience has helped the IRS continue to launch modern services, such as the direct file application, launched in March across 12 states. The IRS said the direct file pilot helped more than 140,000 citizens file their taxes online and for free.

There are plenty more opportunities for the technology development lessons learned from the pandemic to continue to spread across the IRS. Commissioner Danny Werfel told lawmakers in April that the tax agency needs $104 billion for a multi-year modernization effort.

Sieger said the experience over the last three-plus years taught the IRS it can accept and manage risks differently than before.

“We took a lot of risks. We weighed those risks. We said, ‘the worst thing that could happen is this. What are we going to do when that happens?’” she said. “I think our greatest opportunity is not forgetting how we did that, and bringing that forward into future operations. I’m trying not to say don’t be risk averse, but I’m going to say it. Don’t be risk averse and accept measured risk; know what could happen, know how you’ll adapt, but let’s face it, in our personal lives, especially in the technology space, how many of us get an update on our smartphone that didn’t work. But we know the next day it will be updated and fixed. Now I am not suggesting something so aggressive in government. But I am suggesting that we look back to how the government served this country during the pandemic and bring some of those skills and learnings forward to be even more effective and efficient in government service.”

One of the biggest reasons for the IRS’ success, beyond the urgency of the moment, was the top cover leaders gave the developers. Sieger said helping employees reduce the fear of failure and ensuring they know they are not going to be left behind should something go wrong was a huge factor in the agency’s success.

“At the time, it was Commissioner Charles Rettig who was constantly keeping his hand on the pulse of the employees, working with Treasury to ensure that we were delivering the payments and processing tax returns and the IT workforce knew they had support. They were constantly asked, ‘What do you need?’ Sometimes they would tell us what they needed. Sometimes, I saw what they needed, and they wouldn’t ask. There was a particular weekend where the team was working really hard,” she said. “This was not a case of the workforce being hesitant to do new things. This was a case of the workforce having the skills they needed to do this in the most elegant way, and once leadership let them know — from Commissioner Rettig through the different deputy commissioners to myself and all the front line executives at the IRS who helped them — they were able to get things done and help the country. It was an example of coming together at the right time in the right way for the right outcome.”

 

 

Air Force increasing cloud capabilities for the warfighter
https://federalnewsnetwork.com/ask-the-cio/2024/05/air-force-expanding-cloud-as-operational-tactical-lines-blur/
Thu, 16 May 2024 16:14:53 +0000
Venice Goodwine, the Air Force’s CIO, said one goal is to create more transparency on how much money mission owners are spending on cloud services.


The Department of the Air Force chief information officer’s strategy to increase the capabilities of its airmen and guardians centers on expanding the use of cloud services.

Venice Goodwine, the Air Force’s CIO, said the cloud cannot be thought of as just for business applications. The lines between the back office and the tactical edge have blurred, she said.

Venice Goodwine is the Department of the Air Force’s chief information officer.

“I’m expanding the cloud from NIPRNet [unclassified network] to SIPRNet [classified network] and also having all those capabilities as well in that cloud on both sides. As we think about the different classifications, how do we get there with those same human-to-human capabilities are important?” Goodwine said at the recent AFCEA NOVA Air Force IT Day, an excerpt of which was played on Ask the CIO. “The other thing when I’m thinking of the cloud, it’s an investment. But I’m also going to create the transparency that we haven’t seen before in the cloud. Now when I think financial operations in the cloud, I now can talk to my system owners about their investment in the cloud, tell them when to pay for reserve instances. I could talk to them about how can they make adjustments in their investment based on the usage or their computing and storage? I didn’t have that visibility before.”
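
The reserved-instance point is fundamentally a utilization question: a reservation only pays off if the workload runs enough hours to beat on-demand pricing. The sketch below works through that break-even with purely hypothetical rates, not actual cloud-provider prices:

```python
# Break-even check: reserved vs. on-demand pricing for one instance.
# Rates are hypothetical placeholders, not actual cloud-provider prices.
ON_DEMAND_PER_HOUR = 0.40        # $/hour, pay only while running
RESERVED_PER_YEAR  = 2_100.00    # $/year, flat commitment
HOURS_PER_YEAR     = 8_760

break_even_hours = RESERVED_PER_YEAR / ON_DEMAND_PER_HOUR
break_even_util  = break_even_hours / HOURS_PER_YEAR * 100

print(f"Reservation pays off above ~{break_even_hours:.0f} hours/year "
      f"({break_even_util:.0f}% utilization)")

expected_hours = 7_000           # what usage data says the system actually runs
cheaper = "reserved" if expected_hours > break_even_hours else "on-demand"
print(f"At {expected_hours} hours/year, {cheaper} is the better buy")
```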

The Air Force is planning to have a single tenant for Office 365 on the secret side, which is different from what the service did with its unclassified version, which had multiple tenants.

Several other military services and agencies also have rolled out O365 on the secret side recently.

“What’s important for my cloud strategy is making sure that I have cloud at the tactical edge. That’s my reliance on commercial cloud services at the edge because if I’m going to have decision advantage, I have to make sure that the data is available. The data needs to be where the warfighter is and the data needs to be in the cloud,” Goodwine said. “I don’t intend to put the data in the continental United States (CONUS) when I’m fighting in INDOPACOM. I need the data there. But then I also need the cloud at the edge. I need the data at the edge. I need artificial intelligence to make sense of the data. And it needs to be trusted. So all the attributes, you talk about data, I need all of that there. So it’s not just enterprise IT. It is IT for the warfighter. That’s my mantra and you’ll hear me say that all the time and my team speak that same language.”

Air Force expanding virtual environment

The Air Force continues to mature its approach to buying cloud services. Goodwine, who became the CIO in August, said the Joint Warfighting Cloud Capability (JWCC) remains the first option of where to buy cloud services, especially for new workloads. But, she said, those workloads and applications will remain in the CloudOne platform.

The Air Force is working on a new solicitation for CloudOne, called CloudOne Next.

The Air Force released its request for information for CloudOne Next in September and just in March, it offered more details on its acquisition strategy.

The Air Force expects to release three solicitations for CloudOne Next in the third quarter of 2024 and make the award in the fourth quarter of this year. It will be three single-award blanket purchase agreements on top of the schedules program run by the General Services Administration.

As part of this cloud expansion, Goodwine said the Air Force is developing a virtual environment to make it easier to access applications in a secure way.

“If you’re on your home computer, you have a Mac, you can go to portal.apps.mil and you can access your O365. You can be as productive as you need to be. There is no need for you to VPN in and you can use your home network,” she said. “You want to be able to access your OneDrive, all your apps and email, you can do that today. You only VPN in because you’re trying to get to some shared drives that we’re going to shut down eventually anyway. So really, those are the things that we already have in play that we should take advantage of, especially now that we’re in a hybrid environment. As we move forward, yes, understanding the work that’s done, the hours required to do that work so that we can make better investment decisions about the technology that we want to use, so I do think there’s a connection between technology and people hours.”

Additionally, Goodwine said the Air Force will expand its “Desktop Anywhere” initiative beyond just the Air Force Reserve Command.

“It now has an Impact Level 5 authority to operate, and we’re going to move it [off-premise] so we’re expanding that. We’ll have the ability to do more of these virtualized environments,” she said. “From a cybersecurity perspective, it’s a great idea because I just reduced my attack surface and from a productivity perspective, it’s absolutely faster, better, cheaper, and it now really allows you to be mobile, which is what I want my workforce to be, the airmen and guardians.”

The post Air Force increasing cloud capabilities for the warfighter first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/ask-the-cio/2024/05/air-force-expanding-cloud-as-operational-tactical-lines-blur/feed/ 0
Army changing the color of money used to modernize software https://federalnewsnetwork.com/army/2024/05/army-changing-the-color-of-money-used-to-modernize-software/ https://federalnewsnetwork.com/army/2024/05/army-changing-the-color-of-money-used-to-modernize-software/#respond Tue, 14 May 2024 15:58:58 +0000 https://federalnewsnetwork.com/?p=5000433 The Army will keep most software development efforts in ongoing development mode and not transition them to sustainment as part of its modernization efforts.

The post Army changing the color of money used to modernize software first appeared on Federal News Network.

]]>

When it comes to software development, the Army is going to stop worrying about the color of money.

That’s because as part of its new approach to software modernization, the Army is rethinking what sustainment means.

Margaret Boatner, the deputy assistant secretary of the Army for strategy and acquisition reform, said one of the main tenets of the policy signed by Army Secretary Christine Wormuth in March is to reform several legacy processes that are keeping the service from adopting modern software development approaches.

Margaret Boatner, deputy assistant secretary of the Army for strategy and acquisition reform

“We are targeting a couple of really key processes like our test and evaluation processes, and importantly, our cybersecurity processes. We really are trying to modernize and streamline those as well as changing the way we think about sustainment because software is really never done. We really have to retrain ourselves to think about and to acknowledge the fact that software really needs to stay in development all the time,” Boatner said in an exclusive interview with Federal News Network. “Right now, our systems and our acquisition programs, once they’re done being developed, they go through a process that we call transition to sustainment, meaning they’ve been fully developed and are now going to live in our inventory for 10, 20, 30 years. We’re going to sustain them for a long period of time. When a system makes that transition, the financial management regulations dictate that they use a certain color of money, operations and maintenance dollars. With that color of money, we can really only do minor patches, fixes and bug updates. So that’s an example of a legacy process that, when you’re talking about a software system, really tied our arms behind our back. It really prevented us from doing true development over the long term with the software solutions.”

Boatner said under the new policy, software will no longer make the transition to sustainment. Instead, the program office will keep operating under research, development, test and evaluation (RDT&E) funding.

“It’s recognizing that a continuous integration/continuous delivery (CI/CD) model software is never done. That way, our program managers can plan to use the appropriate color of money, which in many cases might be RDT&E, which is the color money you need to do true development,” she said. “So, that will give our program managers a lot more flexibility to determine the appropriate color money based on what they want to do, such that our software systems can really continue to be developed over time.”

The Army has been on this software modernization path for several years, culminating in the March memo.

From the lessons of the 11 software pathway pilots, to testing out a new approach to a continuous authority to operate, to the broad adoption of the Adaptive Acquisition Framework, Boatner and Leo Garciga, the Army’s chief information officer, are clearing obstacles, modernizing policies and attempting to change the culture of how the Army buys, builds and manages software.

Army updating ATO policy

Garciga said by keeping programs under the RDT&E bucket, the Army is recognizing the other changes it needs to complete to make these efforts more successful.

“We need to relook at processes like interoperability. Historically, that was not a parallel process, but definitely a series process. How do we change the way we look at that to bring it into this model where we’re developing at speed and scale all the time?” he said. “I think we’re starting to see the beginnings of the second- and third-order effects of some of these decisions. The software directive really encapsulated some big rocks that need to move. We’re finding things in our processes that we’re going to have to quickly change to get to the end state we’re looking for.”

Since taking over the CIO role in July, Garciga has been on a mission to modernize IT policies that are standing in the way. The latest one is around a continuous ATO (C-ATO).

He said the new policy could be out later this summer.

“We’ve told folks to do DevSecOps and to bring agile into how they deliver software, so how do we accredit that? How do we certify that? What does that model look like? We’re hyper-focused on building out a framework that we can push out to the entire Army,” Garciga said. “Whether you’re at a program of record, or you’re sitting at an Army command, who has an enterprise capability, we will give some guidelines on how we do that, or at least an initial operational framework that says these are the basic steps you need to be certified to do DevSecOps, which really gets to the end state that we’re shooting for.”

He added the current approach to obtaining an ATO is too compliance focused and not risk based.

Pilot demonstrated what is possible

Garciga highlighted a recent example of the barriers to getting C-ATO.

“We started looking at some initial programs with a smart team and we found some interesting things. There was some things that were holding us back like a program that was ready to do CI/CD and actually could do releases every day, but because of interoperability testing and the nature of how we were implementing that in the Army, it was causing them to only release two times a year, which is insane,” he said. “We very quickly got together and rewickered the entire approach for how we were going to do interoperability testing inside the Army. We’re hoping that leads to the department also taking a look at that as we look at the joint force and joint interoperability and maybe they follow our lead, so we can break down some of those barriers.”

Additionally, the Army undertook a pilot to test out this new C-ATO approach.

Garciga said the test case proved a program could receive at least an initial C-ATO in less than 90 days by bringing in red and purple teams to review the code.

“I’d say about three months ago, we actually slimmed down the administrative portion and focused on what were the things that would allow us to protect our data, protect access to a system and make a system survivable. We really condensed down the entire risk management framework (RMF) process to six critical controls,” he said. “On top of that, we added a red team and a purple team to actually do penetration testing in real time against that system as it was deployed in production. What that did is it took our entire time from no ATO to having at least an ATO with conditions down to about less than 90 days. That was really our first pilot to see if we can we actually do this, and what are our challenges in doing that.”

Garciga said one of the big challenges that emerged was the need to train employees to take a more threat-based approach to ATOs. Another challenge that emerged was the Army applied its on-premise ATO approach to the cloud, which Garciga said didn’t make a lot of sense.

“We put some new policy out to really focus on what it means to accredit cloud services and to make that process a lot easier. One of our pilots, as we looked at how do we speed up the process and get someone to a viable CI/CD pipeline, we found things that were really in the way like interoperability testing and how do we get that out of the way and streamline that process,” he said. “In our pilots, the one part that we did find very interesting was this transition of our security control assessors from folks that have historically looked at some very specific paperwork to actually now getting on a system and looking at code, looking at triggers that have happened inside some of our CI/CD tools and making very difficult threshold decisions based on risk and risk that an authorizing official would take to make those decisions. We’re still very much working on what our training plan would be around that piece. That’ll be a big portion of how we’re going to certify CI/CD work and DevSecOps pipelines in the Army moving forward.”

The post Army changing the color of money used to modernize software first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/army/2024/05/army-changing-the-color-of-money-used-to-modernize-software/feed/ 0
OMB forms replacement for FedRAMP JAB https://federalnewsnetwork.com/cybersecurity/2024/05/omb-forms-replacement-for-fedramp-jab/ https://federalnewsnetwork.com/cybersecurity/2024/05/omb-forms-replacement-for-fedramp-jab/#respond Wed, 08 May 2024 21:58:11 +0000 https://federalnewsnetwork.com/?p=4994015 The Office of Management and Budget selected CIOs, CISOs and other technology experts to be part of the new FedRAMP Board, which replaces the JAB.

The post OMB forms replacement for FedRAMP JAB first appeared on Federal News Network.

]]>
The Office of Management and Budget took a major step in the revamping of the cloud security program called FedRAMP.

OMB last week officially created the replacement for the Joint Authorization Board (JAB), called the FedRAMP Board. The new board will provide executive oversight and governance of the program.

An OMB spokesperson says the seven-member board includes legislatively mandated representatives from the General Services Administration and the departments of Defense and Homeland Security, as well as representatives from the Department of Veterans Affairs (VA), the Department of the Air Force, the Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Deposit Insurance Corporation (FDIC). Experts from GSA, DoD and DHS made up the JAB from the start.

“One of our key priorities in selecting members of the FedRAMP Board is to strike the right balance between retaining experience and institutional knowledge from agencies that were part of the Joint Authorization Board (JAB) while also including diverse agency viewpoints into the FedRAMP strategic setting process,” said Drew Myklegard, deputy federal chief information officer in OMB, in an email to Federal News Network.

New policy still in draft

OMB initially introduced the idea of the FedRAMP Board as part of its draft policy update released in October. The spokesperson didn’t offer any insight into when OMB would issue the final memo.

But Federal CIO Clare Martorana said the new memo and related efforts come at a key time for FedRAMP, which is relying on guidance that is more than 10 years old.

“This is a pivotal moment to evolve the FedRAMP Program, aligning it with the dynamic cloud landscape of today and tomorrow,” Martorana said in a statement. “Our schedule included time for an inclusive and collaborative policy design process, where we actively solicited feedback from government agencies, industry, and the general public. By considering diverse perspectives, OMB will help to ensure that our new policy will stand the test of time.”

The Regulations.gov website of the Office of Information and Regulatory Affairs in OMB shows Martorana’s office received 290 comments on the draft guidance.

GSA today also added another piece to the FedRAMP revamp, making changes to the membership and chairperson of the Federal Secure Cloud Advisory Committee (FSCAC) that take effect May 15.

The FSCAC advises FedRAMP on the adoption, use, authorization, monitoring, acquisition and security of cloud computing products and services.

GSA named Larry Hale, the agency’s deputy assistant commissioner in the Office of Information Technology Category Management in the Federal Acquisition Service, as the new chairman, added two new industry members and extended the terms of two current committee members.

GSA established the FSCAC in February 2023; the committee will hold its next meeting on May 20. Its recommendations complement the FedRAMP Technical Advisory Group, an advisory body of federal technical experts, as well as the FedRAMP Board.

Chairperson, vice chairperson to be named

While OMB sorts through the comments on the draft FedRAMP memo, it went ahead and replaced the JAB with new members.

OMB says the board includes CIOs and chief information security officers (CISOs), as well as a deputy CIO whose focus is engineering and CISA’s technical director for cybersecurity.

OMB and GSA will each designate a non-voting member to be chairperson and vice chairperson of the board, who will manage its overall agenda.

The spokesperson said one of the board’s first actions will be to approve a charter that will finalize details around terms. In general, all members of the board will serve time-limited terms and are expected to rotate over time. DoD, DHS, and GSA will consistently have representation on the FedRAMP Board, as established by the FedRAMP Authorization Act.

The spokesperson says the board will have responsibilities similar to the JAB’s, such as reviewing and approving FedRAMP policies and requirements. It will oversee the overall health and performance of FedRAMP, and will work within the federal community to expand the authorization capacity of the FedRAMP ecosystem.

The board, however, is not expected to participate in the approval of individual authorization packages.

“We are currently planning the inaugural FedRAMP Board meeting. The FedRAMP Roadmap and feedback from the Federal Secure Cloud Advisory Committee (FSCAC) will inform the board’s overall agenda,” the OMB spokesperson said. “The FedRAMP Board’s early priorities will include ensuring a smooth transition from the JAB and its provisional authorizations and any work in progress that directly affects customers, engaging with the federal community to increase the number of FedRAMP authorizations performed by one or more agencies, and working with the FedRAMP program to support updated performance metrics, greater consistency across authorization processes and continuous monitoring, and other FedRAMP roadmap initiatives.”

The post OMB forms replacement for FedRAMP JAB first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cybersecurity/2024/05/omb-forms-replacement-for-fedramp-jab/feed/ 0
DISA’s new five-year plan will consolidate support for the warfighter https://federalnewsnetwork.com/federal-newscast/2024/05/disas-new-five-year-plan-will-consolidate-support-for-the-warfighter/ https://federalnewsnetwork.com/federal-newscast/2024/05/disas-new-five-year-plan-will-consolidate-support-for-the-warfighter/#respond Fri, 03 May 2024 14:10:38 +0000 https://federalnewsnetwork.com/?p=4987044 The Defense Information Systems Agency details four strategic imperatives, six operational imperatives and eight goals.

The post DISA’s new five-year plan will consolidate support for the warfighter first appeared on Federal News Network.

]]>
  • Lt. Gen. Bob Skinner adds another piece to the Defense Information Systems Agency's modernization puzzle. DISA's new five-year strategic plan is more about emphasizing and highlighting its current roadmap than making any new or dramatic changes. But DISA Director Air Force Lt. Gen. Bob Skinner said simplifying and consolidating support for the warfighter will continue to drive these initiatives. In the 2025 to 2029 strategic plan, DISA detailed four strategic imperatives, six operational imperatives and eight goals. DISA said it remains focused on several departmentwide, enterprise-level tools, such as delivering a common IT environment by 2030, developing a DoD enterprise cloud environment and integrating identity, credential and access management and zero trust capabilities.
  • The IRS plans to keep adding more employees, but it also needs more money to keep them. The IRS is looking to grow its workforce by about 14% between now and 2029, tapping into some of the $60 billion it got to modernize under the Inflation Reduction Act. But the agency is asking Congress to bump that funding up to $104 billion that it would have to spend through the next decade. IRS Commissioner Danny Werfel said that if those funds run out, the agency will not be able to sustain its growing workforce. “Either you don't replace people that retire, you furlough, and a last resort, you RIF," Werfel said, referring to a reduction in force. "Those are the realities that could happen.”
  • The Office of Personnel Management is rethinking the job skills needed for more than 40,000 human resources employees. Across all agencies, OPM has created new competency models for HR positions. Those models cover all HR management work, as well as more specialized skills. It is part of a broader effort to address strategic human capital management, while emphasizing skills-based hiring. Many of the newly defined skills, like decision-making and teamwork, emphasize hands-on qualifications.
  • The Defense Innovation Unit is looking for new ways to track down cyber adversaries who might already be inside DoD networks. DIU is shopping for what it calls a "hunt kit," which must be able to function without any internet connection. It also cannot rely on any additional resources from a partner’s on-premise infrastructure. The hunt kit must be able to fit in a carry-on bag and it has to meet weight and dimension limitations of international commercial flights. The vendor has to complete a prototype hunt kit for government testing within four months of receiving an Other Transaction award. Responses are due by June 14.
    (DIU seeking joint cyber hunt kit solutions - Defense Innovation Unit)
  • The IRS is letting some of its employees keep working remotely until the start of 2025. IRS planned to end its remote work pilot program in June, but now it will keep it running until January, to continue gathering more feedback and data. The IRS will not add more employees to the pilot program, but employees already in it can choose to opt out. A Treasury Department assessment found jobs advertising remote work led to the most hires, and that retention and engagement scores remained stable. The IRS is already meeting the Biden administration’s requirement to have federal employees working in the office about 50% of the time.
  • Vendors providing technology products to the government now have a better sense of how much of their product agencies are buying, and who is buying it, through the schedules program. The General Services Administration expanded the "demand data" program to vendors who provide technology products like laptops or software licenses. GSA said through demand data, contractors can customize their price list to those items that customers are more likely to buy. GSA launched the demand data effort in January 2023 for the general supplies and services schedule and found a relatively small number of products generated 50% of overall sales.
  • The federal HR workforce may soon see more support from the Office of Personnel Management. OPM is making plans to launch an HR career growth website this fall. The online platform will be a way for federal HR practitioners to access information and interact as a community. In the meantime, OPM is currently piloting an HR career pathing model at nine agencies. The end goal is to encourage better retention of HR employees, and help them grow in their careers.
    (Upcoming launch of HR career growth platform - Office of Personnel Management)
  • Sasha Baker, the acting under secretary of Defense for policy, officially stepped down from her post last week. She had been serving in that role since 2023, when Colin Kahl left the position. Amanda Dory, the director of the Africa Center for Strategic Studies at the National Defense University, will temporarily step into Baker’s role. Last year, President Joe Biden nominated Derek Chollet to be the Pentagon’s policy chief, and he renominated him earlier this year after the nomination stalled in the Senate.

The post DISA’s new five-year plan will consolidate support for the warfighter first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-newscast/2024/05/disas-new-five-year-plan-will-consolidate-support-for-the-warfighter/feed/ 0
Beyond modernization: The cloud is a secure platform for mission innovation https://federalnewsnetwork.com/federal-insights/2024/04/beyond-modernization-the-cloud-is-a-secure-platform-for-mission-innovation/ https://federalnewsnetwork.com/federal-insights/2024/04/beyond-modernization-the-cloud-is-a-secure-platform-for-mission-innovation/#respond Mon, 29 Apr 2024 16:56:57 +0000 https://federalnewsnetwork.com/?p=4980988 Classified cloud enables agencies to redefine essential mission workflows.

The post Beyond modernization: The cloud is a secure platform for mission innovation first appeared on Federal News Network.

]]>
This content was provided by Microsoft.

Modernizing federal IT systems to support mission demands should accomplish more than just moving apps and data to the cloud. It is time to reframe the basic question about cloud modernization, especially for classified workloads. Instead of asking, “Can this mission task run in the cloud?” consider asking, “What can the cloud enable us to do that we could never do before?”

From powering collaboration and decision-making at the edge to taking advantage of generative AI (GenAI) and preparing for quantum computing, classified cloud enables agencies to redefine essential mission workflows.

Still, it’s crucial to avoid implementing a solution in search of a problem. Mission owners and technologists need to collaborate and agree on solution applications, while governance and culture ensure the cloud empowers critical workflows. Agencies need to work closely with policymakers to ensure that new capabilities keep pace with ever-changing conditions.

Success also requires industry partners who provide the tools, expertise, and experience to help create a platform for change. Most of all, experimentation is needed to truly see the impact of the cloud on mission workflows. Here is where to begin.

Empowering innovation with AI

Classified cloud’s security, agility, and capability activate new ways to deliver on mission priorities. Cloud platforms allow new, game-changing technologies to flourish, including GenAI, which delivers intuitive automation that supports, not supplants, human decision-makers and operators.

By shifting tedious, data-intensive tasks—including summarization, analysis, code development and more—to AI-powered copilots, users gain the freedom to find new ways to solve problems.

GenAI is most often delivered through cloud platforms due to the cost and complexity of building and maintaining these services locally. This as-a-service model delivers efficiency and performance within a managed, secure environment.
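To make the as-a-service model concrete, here is a minimal, hypothetical sketch of how a mission application might hand a tedious summarization task to a managed generative AI chat endpoint. It assumes an Azure OpenAI deployment reachable from the enclave; the endpoint, API version, key handling, deployment name and input file are placeholders, not details from this article.

```python
import os
from openai import AzureOpenAI  # pip install openai

# Hypothetical example: endpoint, key and deployment name are placeholders.
client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],  # the agency's managed endpoint
    api_key=os.environ["AOAI_API_KEY"],          # retrieved from a vault in practice
    api_version="2024-02-01",
)

# Shift a data-intensive task (summarization) to the managed service.
response = client.chat.completions.create(
    model="summarizer-deployment",  # name of the agency's model deployment (placeholder)
    messages=[
        {"role": "system", "content": "Summarize logistics reports in three bullet points."},
        {"role": "user", "content": open("daily_supply_report.txt").read()},
    ],
)
print(response.choices[0].message.content)
```

The point of the sketch is the consumption pattern: the model, scaling and patching live in the provider's accredited environment, and the mission application only sends a request and reads the result.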

Where imagination meets implementation

Cloud-native services empower users to create vastly more efficient mission workflows using AI-powered solutions. Logistics teams, for example, can easily create secure, connected processes to support:

  • Real-time data integration and analysis. AI-powered predictive analytics enables more accurate ordering and transportation planning, reducing waste and improving responsiveness.
  • Better collaboration. Centralized access to data and applications enables faster, more informed supply chain planning to meet changing conditions.
  • Security and compliance. Some classified cloud platforms, such as Azure Government Secret, minimize risk by supporting security up to DoD Impact Level 6 (IL6) and Director of National Intelligence (DNI) Intelligence Community Directive (ICD 503) accreditation with facilities at ICD 705. Cloud-native encryption tools and AI-based cybersecurity apps protect data from unauthorized use, while identity and access capabilities can be easily managed across the entire ecosystem.

Using the cloud to create enhanced mission workflows starts with a desired outcome, such as faster intelligence analysis and distribution. Agencies can define “what if?” scenarios and then use the full scope of cloud capabilities to investigate new ways to support the mission.

The essential elements for innovating in the cloud

The cloud makes it faster and easier to stand up testbeds to identify future challenges and solutions, from battling emerging threats to readying the organization for quantum computing. Preparing for a cloud-centric operating model takes a combination of elements, including:

  • Understanding mission problems that require a solution. Just because technology can be applied to a mission does not mean the use case reflects reality. Mission owners and technologists need to agree on the benefits of a solution from end to end: Does it solve a mission problem, and can it be supported by IT?
  • Effective governance and usage policies. Policymakers should be informed and educated on technological advancements to understand their capabilities and limitations. It is just as important to explain how innovative technology enhances mission capabilities. This clarity helps leaders develop appropriate guidelines for cloud usage, which helps prevent roadblocks to innovation.
  • Imagination and experimentation. Investigating new use cases in a pilot program or exercise is critical for assessing a technology’s effectiveness in overcoming mission challenges. This approach limits the risk of investing in “nice to have” solutions—or solutions that do not actually solve a problem.

An additional ingredient is the right technology partner, whose understanding of the mission is as important as delivering the appropriate technology.

Commercial cloud providers can innovate at scale and speed, often outpacing mission implementations. A trusted technology partner provides insights on how to use new cloud capabilities to move the mission forward. Their recommendations come from experience with both technology and the ways to achieve mission goals—essential elements for success.

Driving toward transformation

Beyond its exceptional performance, reliability, scalability, and speed, classified cloud’s security enables agencies to innovate with confidence. Classified cloud stands ready to support emerging technologies and mission objectives. With the support of a trusted partner, agencies can go beyond modernizing to truly transforming operations and accelerating mission priorities.

Learn how the cloud empowers agencies to innovate and activate new workflows at mission speed, quickly and securely. Read more: GenAI for US Federal Government (microsoft.com).

The post Beyond modernization: The cloud is a secure platform for mission innovation first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/beyond-modernization-the-cloud-is-a-secure-platform-for-mission-innovation/feed/ 0
StateRAMP Exchange 2024: N.H.’s Ken Weeks, Maine’s Charles Rote on value of ‘do once, use many’ approach https://federalnewsnetwork.com/federal-insights/2024/04/stateramp-exchange-2024-n-h-s-ken-weeks-maines-charles-rote-on-value-of-do-once-use-many-approach/ https://federalnewsnetwork.com/federal-insights/2024/04/stateramp-exchange-2024-n-h-s-ken-weeks-maines-charles-rote-on-value-of-do-once-use-many-approach/#respond Thu, 25 Apr 2024 20:48:37 +0000 https://federalnewsnetwork.com/?p=4977553 The New Hampshire and Maine cyber leaders say using the StateRAMP cloud security shared service accelerates their digital transformation efforts.

The post StateRAMP Exchange 2024: N.H.’s Ken Weeks, Maine’s Charles Rote on value of ‘do once, use many’ approach first appeared on Federal News Network.

]]>

When Ken Weeks became the chief information security officer for the state of New Hampshire in 2022, he considered setting up a cloud security program just to serve his state.

Around the same time, Charles Rote, deputy CISO for the state of Maine, recognized the need to validate and verify the security of cloud services but quickly realized he had a people problem. Maine, like many public and private sector organizations, just couldn’t hire enough cybersecurity experts.

Weeks, meanwhile, had his New Hampshire team of three look into what it would take to set up an oversight security program.

What both Rote and Weeks quickly grasped was that the “do once, use many” mantra offered the fastest road to modernizing their states’ aging technology infrastructures and applications.

“Having a repeatable process, through StateRAMP or FedRAMP that aligns with the standards that our policies and procedures are written to anyway, has been a lifesaver for me,” Weeks said during the Federal News Network StateRAMP Exchange 2024.

“In New Hampshire, for example, we’re fortunate that we looked into the crystal ball and guessed properly and aligned our policies and procedures with the same set of National Institute of Standards and Technology standards that StateRAMP currently operates under. By doing that, when we select a vendor or give a preference to a vendor who’s gone through the StateRAMP process, we know that not only have they met the same security standards and controls that we impose upon ourselves at the state, but they’ve been verified by somebody else.”

It means the state no longer has to take a vendor’s word for the validity of cybersecurity statements, Weeks said. “We have a trusted third party, both from the auditing documents that are submitted to StateRAMP as well as the StateRAMP program management office that’s reviewing all those in great detail before we select the vendor and start doing business with them.”

Maine’s workforce challenges

StateRAMP, the cloud security program modeled after the federal cloud security program FedRAMP, launched in 2021 to provide a shared service model for best practices and standardization in cloud security verification.

Rote said the dual challenges of hiring cyber talent and maintaining an aging technology stack drove the decision to lean into StateRAMP.

“By transitioning to a cloud environment, oftentimes if you have the right partner, it causes you to be more disciplined in how you manage that environment. Applications can’t be exceptionally old to operate in these cloud environments, so that pushes some of the application development and our infrastructure folks to be more modern, more current and maintain the security at an appropriate level,” Rote said.

“The importance of the security assurances is we only select certain assurances for those most critical datasets or those most critical business functions. By selecting only the best of those credentialing requirements, it alleviates a lot of burden on our information security office, which may not have the appropriate personnel and the appropriate time to do the assurances. But also, oftentimes, if you’re not leveraging these third-party audits and results, the solutions can be a bit of a black box too. You’re not necessarily doing your due diligence to make sure that what you’re putting in that environment is truly safe.”

For Maine, New Hampshire and likely many other state and local organizations, having that confidence in their chosen cloud service providers removes one level of complexity as they continue on their digital transformation journeys.

Reducing the data center footprint

Weeks said, for example, New Hampshire recently accelerated its move of applications and workloads to the cloud.

“It has a lot to do with maintaining data centers and keeping data centers modern — and the expense that goes into that from a power, space and cooling standpoint. It’s also having enough staff to do all of the necessary cyber hygiene things, security updates, application updates and so on,” he said. “The idea is to host as little on premise and have as few remaining custom applications as we can. We’re looking for more and more commercial products that can be hosted as a managed service, both from a cloud hosting environment as well as the applications themselves.”

Weeks said the reliance on StateRAMP also has made it easier to contract with cloud service providers, which then speeds up digital transformation.

If a CSP is StateRAMP-approved, New Hampshire accepts the certification and focuses on the other aspects of the contract such as terms and conditions. But Weeks warned that while having a StateRAMP certification is reassuring, it’s still incumbent on his team to understand all the risks associated with every product the state acquires and implements.

“It’s important to point out that just because a product from a company is StateRAMP- or FedRAMP-certified, that doesn’t mean the company is certified, it’s just that product,” he said. “You have to make that distinction, and you also have to make a distinction between the product and the hosting environment. Because you can take a certified product and host it at Joe’s Chicken Shack and Web Hosting, and you’ve kind of defeated the purpose of having a StateRAMP-certified product.”

Maine is using the confidence in the CSPs to move toward a single enterprise. Rote said each of the CSPs becomes a part of a larger network.

“There has to be an inherent level of trust associated with that. We were lucky too as we saw the tea leaves and the writing on the wall that NIST standards were the way to go. Utilizing cloud service providers at the various impact levels translates well and enables us to deploy capability at an efficient rate with the right partners,” he said. “The great thing about leveraging FedRAMP for anything that’s holding our federal regulatory compliance requirements is that the federal government approaches certifications of clouds services in a similar fashion to the way that we can explain being accepting of the StateRAMP certifications. So it expedites our ability to migrate data from the federal government that’s been shared with us. That is essential to us conducting our businesses.”

He added that when using a CSP that meets FedRAMP requirements, the different federal agencies the state deals with are more likely to accept that single security baseline, typically at the FedRAMP Moderate certification.

“However, when it’s on premise and you’re dealing with each one of these federal entities in isolation, they tend to pile on all these extra requirements on the state in their compliance regimes, and it can be problematic and it’s a significant burden,” Rote said.

Discover more tips and tactics shared during the Federal News Network StateRAMP Exchange 2024.

The post StateRAMP Exchange 2024: N.H.’s Ken Weeks, Maine’s Charles Rote on value of ‘do once, use many’ approach first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/stateramp-exchange-2024-n-h-s-ken-weeks-maines-charles-rote-on-value-of-do-once-use-many-approach/feed/ 0
StateRAMP Exchange 2024 : NASCIO’s Robinson, Weaver on trusting cloud services https://federalnewsnetwork.com/federal-insights/2024/04/2024-stateramp-exchange-nascios-robinson-weaver-on-trusting-cloud-services/ https://federalnewsnetwork.com/federal-insights/2024/04/2024-stateramp-exchange-nascios-robinson-weaver-on-trusting-cloud-services/#respond Wed, 24 Apr 2024 14:15:14 +0000 https://federalnewsnetwork.com/?p=4974978 State chief information officers say cloud services provide as good or better security than on-premise infrastructure, NASCIO research finds.

The post StateRAMP Exchange 2024 : NASCIO’s Robinson, Weaver on trusting cloud services first appeared on Federal News Network.

]]>

In the span of about a decade, state chief information officers flipped their script on cloud computing.

In the 2023 State CIO Survey, state CIOs overwhelmingly identified security as the most important benefit of cloud computing.

Doug Robinson, executive director of the National Association of State CIOs, said a decade ago the survey results were quite different.

Back then, “we asked the question: ‘What’s the major impediment or barrier to broader cloud adoption?’ The number one answer was, ‘We’re concerned about security,’ ” Robinson said during the Federal News Network StateRAMP Exchange 2024.

“We’ve seen this huge shift about their concern. Part of that was the fact that 87% said that they agreed that cybersecurity offered by third-party cloud providers is either on par or better than the security measures in place in their state government. That was a huge shift in the understanding and recognition, in addition to the investments that many of the cloud service providers put into their platforms in complying with StateRAMP — but also the fact that they had to meet the compliance of the various other federal regulatory requirements.”

Now, state CIOs say security has become the cloud characteristic that matters the most. Robinson said the shift is one major reason why StateRAMP, the state version of the Federal Risk Authorization and Management Program (FedRAMP) cloud security initiative, has taken hold and is seen as valuable.

“We saw accelerated cloud adoption during the pandemic because there was a compelling need — because states needed a speed to solution market adjustment. They had governors and other public officials saying, ‘We need to deliver this solution to citizens very quickly and that we need to scale it to millions of citizens very quickly. And we need to do this in a cost effective manner,’ ” Robinson said. “Now that the states are definitely invested in cloud solutions, whether it’s on premise or off, they need the confidence that the CSPs meet the security demands, and StateRAMP provides that as an independent, neutral, third-party arbitrator. It addresses a lot of the constraints that the states currently have to deal with.”

Pandemic cloud efforts laid groundwork for what’s to come

Jim Weaver, president of NASCIO and CIO for the state of North Carolina, said he experienced this firsthand during the pandemic.

Weaver, who was the state of Washington’s CIO during the pandemic, said the emergency drove innovation and created an opportunity to change the way state and local agencies did business. As a result, states took advantage of secure cloud services, he said.

“What we learned was how tangled our architecture truly was. As much as we thought we understood our architecture, we did not realize the spaghetti mess. I can recall as we were trying to move our disease reporting system, it was not an application. It was about 30 applications that had to move in tandem. That was eye opening to many, but we got it done,” Weaver said. “Now that they have the ability to be more agile and to be able to pivot in different directions, the flexibility that was provided to the business via the cloud is really going to be the game changer moving forward.”

The foundation of secure cloud services means state and local governments also can better prepare for new capabilities coming from artificial intelligence, specifically generative AI, and advanced analytics.

Weaver said these tools have to be flexible enough to support the business of government, and the only way to do that is through secure cloud services.

The confidence in CSPs being secure came initially from the FedRAMP certification and now from StateRAMP. Weaver said states moved applications to the cloud at an accelerated pace thanks to these assurances.

“Having vendors go through the StateRAMP process and get certified in that regard was very beneficial for us. We never looked to do our own thing. We’re very much looking at our partners in StateRAMP and leveraging what they have to offer when we look at the vendor community and who has gone through that certification process,” Weaver said. “When it comes to security, at the end of day, a bad app — whether hosted on premise or hosted in the cloud — is still vulnerable. I think it’s incumbent upon us to partner with the right vendor, who can understand our business, integrates with us very nicely and helps drive us along.”

Spending on better outcomes for citizens

The other big benefit state CIOs are seeing from taking advantage of StateRAMP is that it lets them focus their digital transformation efforts on business capabilities instead of technology tools.

Robinson said CIOs see the cloud no longer as a top priority but as an expectation of how they will modernize.

“Cloud is clearly a significant part of their modernization efforts, along with several others. But if you look at cloud adoption, which is now certainly extremely high, all states are doing something in cloud — either private, on-premise or, in most cases, third-party, off-premise cloud services,” he said. “What you see is the coupling of that with a number of other opportunities, may be things like looking at their data center footprint and making cases for closing down physical data centers so they can move to off-premise cloud solutions.”

Even so, states still have significant technical debt, Robinson said. “Our other modernization and application modernization study showed that 50% of the applications that are currently residing in state government and being used would be considered a legacy environment.”

Weaver added that if North Carolina can spend $1 on a program instead of on an IT system, that is better for the citizens.

“When I talk about digital transformation, it really encompasses connectivity, which is essentially broadband. You have the cyber component, you have the privacy component and then you have the legacy modernization component, which is basically saying transition to the cloud. What stops us a lot of times from being able to enable digital transformation opportunities is the back end systems that are there, basically at the forefront of supporting constituent services,” he said. “We’re all coming to a point in time now from a capital aspect, we’re seeing the unnecessary investments that need to go back into a facility. And that’s probably not the best use of those dollars, when those dollars can get redirected to other modernization opportunities to get us to a better outcome for the citizens we serve.”

Discover more tips and tactics shared during the Federal News Network StateRAMP Exchange 2024.

The post StateRAMP Exchange 2024 : NASCIO’s Robinson, Weaver on trusting cloud services first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/2024-stateramp-exchange-nascios-robinson-weaver-on-trusting-cloud-services/feed/ 0
Cloud native in the government: Challenges and opportunities https://federalnewsnetwork.com/federal-insights/2024/04/cloud-native-in-the-government-challenges-and-opportunities/ https://federalnewsnetwork.com/federal-insights/2024/04/cloud-native-in-the-government-challenges-and-opportunities/#respond Mon, 22 Apr 2024 18:52:32 +0000 https://federalnewsnetwork.com/?p=4972293 Federal agencies can overcome challenges around Kubernetes, and even open up new opportunities, with the right strategies.

The post Cloud native in the government: Challenges and opportunities first appeared on Federal News Network.

]]>
As federal agencies continue to adopt new methods of digital innovation to move faster, they can add a significant amount of complexity. Applications have to work the same way on-premises, in whatever public clouds they’ve adopted and, increasingly, at the edge. Containerization is making that possible, and while Kubernetes has emerged as the default infrastructure for container orchestration across all these environments, it comes with its own challenges as well.

Spectro Cloud’s independent research report, 2023 State of Production Kubernetes, found that 98% of respondents encountered challenges using Kubernetes in production. Further, 75% suffered issues caused by interoperability between software elements running in their clusters, up from 66% in 2022.

“The challenge for IT teams that have to manage Kubernetes infrastructure is that they don’t just manage Kubernetes, but a specific ‘stack’ of cloud native software required for those applications to run,” says Mark Shayda, senior solutions architect at Spectro Cloud. “And those skills are not easy to find, especially in the public sector, which exacerbates the challenge.”

Indeed, 40% of respondents said they lack the skills or headcount to manage Kubernetes infrastructure, up from 36% in 2022.

Another dimension to this is technology debt: Most of the government’s application workloads are currently hosted on virtual machines (VMs).

“In the public sector it’s been virtual machines since the early 2000s. That was a shift from large monolithic applications running on individual servers to running them in VMs,” Shayda said.

Finally, there’s the challenge of edge. Among certain agencies especially, endpoints in the field have proliferated massively, from drones and sensors to Raspberry Pis and military field kits.

Deploying and managing edge devices and applications is not easy, especially when it has to be simple enough for users in the field to understand. Think of a warfighter on the ground carrying a backpack that runs an edge app. And of course, security is necessary from the physical device to the application.

Overcoming the challenges 

How can the public sector overcome those challenges? Establishing processes and tools that ensure that every Kubernetes cluster is not a “snowflake” is the only way to scale, especially when thinking about deploying applications to more challenging environments such as the edge.

A key first requirement is building a strategy around repeatability, with the end goal being to centrally manage and orchestrate complete Kubernetes “stacks” at scale, purpose-built for each individual application use case. “The industry has already acknowledged the need for a simplified way to manage Kubernetes infrastructure in a ‘declarative’ manner with projects like CNCF’s Cluster API,” says William Crum, software engineer at Spectro Cloud. “This means being able to prescribe how the environment’s desired state should look, similar to how Kubernetes itself works with containers in a declarative way. The key requirement is to establish a mechanism for defining ‘blueprints’ of complete stacks that include all the necessary software elements for applications to work, and then centrally managing them across any location.”
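As a rough illustration of that declarative approach, the sketch below registers a desired cluster “blueprint” with a Cluster API-enabled management cluster using the Kubernetes Python client. The cluster name, namespace, pod CIDR and the Docker infrastructure provider are placeholder assumptions for the example, not details of any particular vendor’s platform.

```python
from kubernetes import client, config

# Load credentials for the Cluster API management cluster (assumes a local kubeconfig).
config.load_kube_config()

# A declarative "blueprint": the desired state of a workload cluster.
# Name, namespace, CIDR and the Docker infrastructure provider are placeholders.
cluster_blueprint = {
    "apiVersion": "cluster.x-k8s.io/v1beta1",
    "kind": "Cluster",
    "metadata": {"name": "edge-site-01", "namespace": "default"},
    "spec": {
        "clusterNetwork": {"pods": {"cidrBlocks": ["192.168.0.0/16"]}},
        # In practice this reference points at a provider-specific infrastructure
        # object (cloud, bare metal or edge) defined alongside the blueprint.
        "infrastructureRef": {
            "apiVersion": "infrastructure.cluster.x-k8s.io/v1beta1",
            "kind": "DockerCluster",
            "name": "edge-site-01",
        },
    },
}

# Hand the blueprint to the management cluster; its controllers reconcile it to reality.
client.CustomObjectsApi().create_namespaced_custom_object(
    group="cluster.x-k8s.io",
    version="v1beta1",
    namespace="default",
    plural="clusters",
    body=cluster_blueprint,
)
```

The same pattern scales to hundreds of sites: the blueprint, not a hand-built cluster, becomes the unit that gets versioned, reviewed and reused.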

Repeatability also provides consistency across the lifecycle, making it easier for agency employees in the Defense Department and warfighters to navigate the complexity of cloud native infrastructure and Kubernetes. The right management platform can enable the creation of known-good cluster configurations and make deployment easier, ensuring that all the environments remain consistent over time, which matters for mission-critical infrastructure. The right tools also let users perform common tasks through intuitive graphical interfaces, and automating complex tasks such as onboarding new edge hosts simplifies technical processes for warfighters.

“I think that specifically provides value for not only service members but the Defense organization as a whole,” said Crum. “A lot of this complexity can be abstracted with very easy to use and simple user interfaces where I don’t have to teach Marines or soldiers how to understand all the intricacies of Kubernetes. I can simply just show them the user interface and how that represents the architecture.”

“To me, money and saving lives, in the end, is really what it’s all about. How can we do things better, cheaper, faster, and to where, in the end, we’re saving warfighters’ lives?” added Shayda.

When it comes to legacy workloads running on existing VMs, the ever-maturing cloud native ecosystem around Kubernetes can also provide opportunities for more efficiency through consolidation, especially after the turmoil that last year’s Broadcom acquisition of VMware caused in the industry.

“Virtual machines work; they’ve worked for 20 years,” said Crum. “Sometimes that leads technology experts at federal agencies to adopt an ‘if it’s not broke, don’t fix it’ mindset. But now more than before, they should be asking whether they can improve on current dynamics. Doing so can allow them to reduce costs and focus more resources on their missions, especially national security.”

If an agency is running both VMs and Kubernetes, it has two parallel environments and platforms managed by different teams with different skill sets. It may not be possible to completely eliminate VMs, as some workloads still require them due to their stability and maturity. But some of those workloads can be brought into Kubernetes clusters under one unified management platform, so that the same policies, controls and management practices can be applied to them.
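
One widely used open source route for this kind of consolidation is KubeVirt, a CNCF project that represents virtual machines as Kubernetes resources so the same tooling governs VMs and containers alike. The article does not name a specific tool, so treat the abridged manifest below as an illustrative sketch; the VM name, registry, image and sizing are hypothetical, and the real schema has more fields than shown.

```python
# Abridged KubeVirt-style VirtualMachine manifest, built as data and emitted as
# JSON (Kubernetes accepts JSON as well as YAML manifests). Names, registry,
# image and sizing are hypothetical.
import json

legacy_vm = {
    "apiVersion": "kubevirt.io/v1",
    "kind": "VirtualMachine",
    "metadata": {"name": "legacy-records-app"},
    "spec": {
        "running": True,
        "template": {
            "spec": {
                "domain": {
                    "devices": {"disks": [{"name": "rootdisk", "disk": {"bus": "virtio"}}]},
                    "resources": {"requests": {"cpu": "2", "memory": "4Gi"}},
                },
                # containerDisk is one of several volume types KubeVirt supports
                "volumes": [
                    {"name": "rootdisk",
                     "containerDisk": {"image": "registry.example.gov/legacy-app:1.0"}}
                ],
            }
        },
    },
}

print(json.dumps(legacy_vm, indent=2))
```

Once the VM is just another Kubernetes object, the same RBAC rules, network policies and GitOps pipelines that govern containers apply to it as well.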

“That can improve both governance and efficiency, ultimately speeding up application innovation,” said Crum.

The post Cloud native in the government: Challenges and opportunities first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/cloud-native-in-the-government-challenges-and-opportunities/feed/ 0
Dissecting the federal cloud ‘forecast’ https://federalnewsnetwork.com/federal-insights/2024/04/dissecting-the-federal-cloud-forecast/ https://federalnewsnetwork.com/federal-insights/2024/04/dissecting-the-federal-cloud-forecast/#respond Mon, 22 Apr 2024 18:30:15 +0000 https://federalnewsnetwork.com/?p=4972265 The new capabilities, applications and business models unlocked by cloud adoption are expected to have an outsized effect on the business of government.

The post Dissecting the federal cloud ‘forecast’ first appeared on Federal News Network.

]]>
Cloud service adoption among federal agencies has changed significantly since it first kicked off more than a decade ago. Cost savings and enhanced capabilities, while certainly still motivators, have given way to more targeted incentives. Cloud services, for example, provide collaborative tools that enable hybrid work environments. They facilitate data collection that can help agencies make better decisions about everything from workplace efficiency to cybersecurity. Finally, they’ve helped federal agencies home in on customer experience, delivering more efficient and personalized services for constituents.

Challenges of cloud adoption

Of course, there have been challenges along the way. Data sovereignty, transport and storage have mostly been worked out at this point, especially through the introduction of government-specific and even classified cloud instances.

However, service-level agreements have been a weak point. On the latest Federal Information Technology Acquisition Reform Act (FITARA) scorecard, most agencies received failing grades when it came to meeting the Office of Management and Budget’s requirements for the Federal Cloud Computing Strategy. A Government Accountability Office (GAO) report backs that up, noting that, “specifically, agencies’ service level agreements did not consistently define performance metrics, including how they would be measured, and the enforcement mechanisms.”
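
To illustrate what a well-defined service level metric looks like, here is a minimal sketch that measures monthly availability from recorded outage minutes and applies an enforcement rule when the target is missed. The 99.9% target and the service-credit tiers are invented for the example and are not drawn from any agency’s actual agreement.

```python
# Hypothetical SLA check: availability = (total minutes - outage minutes) / total minutes.
# The target and credit tiers are illustrative only.
SLA_TARGET = 99.9                  # percent availability per month
MINUTES_IN_MONTH = 30 * 24 * 60

def availability_pct(outage_minutes: float) -> float:
    return 100.0 * (MINUTES_IN_MONTH - outage_minutes) / MINUTES_IN_MONTH

def service_credit(availability: float) -> int:
    """Return the percent credit owed on the monthly bill when the SLA is missed."""
    if availability >= SLA_TARGET:
        return 0
    if availability >= 99.0:
        return 10
    return 25

if __name__ == "__main__":
    outage = 120                   # minutes of downtime recorded this month
    avail = availability_pct(outage)
    print(f"availability: {avail:.3f}%  service credit: {service_credit(avail)}%")
```

Writing the metric, its measurement method and the remedy down this explicitly is exactly what the GAO found many agency agreements were missing.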

Security has been particularly challenging as well. The proliferation of online threats from bad actors has made manual monitoring nearly impossible, putting data and network infrastructure at risk. For example, a 2023 GAO report found that of 15 systems examined across four departments, the departments had implemented continuous monitoring in only three. However, artificial intelligence may offer a solution to this particular struggle.

The federal cloud “forecast”

AI is one of the emerging technologies, enabled by cloud services, that is expected to have a major impact in the near future. In a survey, Gartner found that 87% of organizations expect to adopt AI by 2025, and federal agencies are no exception. Currently, agencies are at varying points in figuring out how to integrate AI into their existing processes. Developing strategies for responsible and ethical AI that emphasize accountability, transparency and fairness is another hurdle agencies will have to clear before they can fully adopt the technology.

However, once they do, AI – including generative AI and large language models – can be trained on the massive amounts of security data agencies have been collecting thanks to their cloud investments, making continuous monitoring not only easier, but cheaper and more effective as well. AI-enhanced tools can also be used to help shift security left, making development processes more secure.
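
As a rough sketch of how machine learning could support that kind of continuous monitoring, the example below trains an unsupervised anomaly detector on numeric features extracted from historical security telemetry and flags unusual new events. The feature names and values are hypothetical placeholders; a production pipeline would need real log parsing, tuning and human review of every alert.

```python
# Toy continuous-monitoring sketch: unsupervised anomaly detection over
# numeric features derived from security telemetry (features are hypothetical).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Columns: [login_failures_per_hour, bytes_out_MB, distinct_dest_ips]
baseline = rng.normal(loc=[2, 50, 10], scale=[1, 10, 3], size=(5000, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

new_events = np.array([
    [3, 55, 11],       # resembles normal traffic
    [40, 900, 250],    # burst of failures plus an unusually large transfer
])
for event, flag in zip(new_events, detector.predict(new_events)):  # 1 = normal, -1 = anomaly
    print(event, "ANOMALY" if flag == -1 else "ok")
```

The point is not the specific algorithm but the pattern: models trained on the security data agencies already collect in the cloud can watch every event continuously, something human analysts cannot do.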

Further, generative AI can help agencies speed up content generation and deliver new applications and services. AI-enhanced analytics tools can improve data-driven decision-making, and large language models can help deliver a better customer experience.

Agencies can also leverage automation through cloud service adoption, making employees more effective by reducing the number of manual tasks they have to perform. That saves time, since employees spend less effort re-entering data across disparate systems, and it reduces the effect of human error, yielding cleaner data for better decision making.
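
As a minimal sketch of that kind of automation, the script below pulls records from one system and pushes them into another so staff are not re-keying data by hand. The endpoints and field names are invented for illustration; a real integration would add authentication, retries and validation.

```python
# Hypothetical system-to-system sync that replaces manual data entry.
# URLs and field names are invented for illustration.
import requests

SOURCE_API = "https://inventory.example.gov/api/assets"
TARGET_API = "https://cmdb.example.gov/api/records"

def sync_assets(session: requests.Session) -> int:
    """Copy every asset from the source system into the target system."""
    assets = session.get(SOURCE_API, timeout=30).json()
    created = 0
    for asset in assets:
        record = {
            "asset_id": asset["id"],
            "owner": asset.get("owner", "unassigned"),
            "last_seen": asset["last_seen"],
        }
        session.post(TARGET_API, json=record, timeout=30).raise_for_status()
        created += 1
    return created

if __name__ == "__main__":
    with requests.Session() as s:
        print(f"synced {sync_assets(s)} records")
```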

Cloud adoption is also making edge computing more common, enabling agencies to collect, store and process data in the field for faster and better decision making. That will lead to better outcomes in time-sensitive domains like law enforcement and military operations, and it will allow scientists and researchers to process more of their data on site. In addition, edge computing contributes to both network security and cost effectiveness by reducing the need to transfer large amounts of data.
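
To illustrate the data-reduction point, the sketch below aggregates raw sensor readings at the edge and ships only a compact summary upstream. The sensor values and field names are invented; the saving comes from transmitting a handful of statistics instead of every raw sample.

```python
# Edge-side aggregation: summarize raw readings locally, transmit only the summary.
# Sensor values and field names are hypothetical.
import json
import statistics

def summarize(readings: list[float], sensor_id: str) -> dict:
    return {
        "sensor": sensor_id,
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }

if __name__ == "__main__":
    raw = [20.1 + 0.01 * i for i in range(10_000)]   # 10,000 samples collected in the field
    summary = summarize(raw, "thermal-07")
    print(f"raw payload ~{len(json.dumps(raw)):,} bytes, "
          f"summary ~{len(json.dumps(summary))} bytes")
    print(summary)
```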

And cloud services themselves are expected to grow significantly as agencies turn to more as-a-service offerings to help them accomplish their missions. Gartner expects the global public cloud services market to grow to $501.3 billion in 2024, up from $364.1 billion in 2022. This growth will likely be driven by increased adoption of multi-cloud strategies, which help agencies reduce vendor lock-in, improve flexibility and resilience, and optimize performance by choosing the best tool for each need rather than relying on a single vendor for all their cloud services.

This will also drive an increase in interoperability and portability, allowing agencies to move data and workloads from one platform or environment to another regardless of provider. That flexibility will allow agencies to further optimize costs and efficiencies through strategic planning and management.

Finally, cloud-managed networking solutions will give agencies increased visibility across these platforms, environments and services. Combined with the governmentwide push to adopt zero trust security, that visibility will better equip agencies to protect their own networks and data, as well as the data of their customers.

For all of these reasons and more, the new capabilities, applications and business models unlocked by increased cloud adoption are expected to have an outsized effect on the business of government within the next few years.

The post Dissecting the federal cloud ‘forecast’ first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/dissecting-the-federal-cloud-forecast/feed/ 0
Federal Executive Forum Secure Cloud Computing Strategies in Government Progress and Best Practices 2024 https://federalnewsnetwork.com/cme-event/federal-executive-forum/federal-executive-forum-secure-cloud-computing-strategies-in-government-progress-and-best-practices-2024/ Mon, 22 Apr 2024 15:41:22 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4971908 As cloud technology use cases become increasingly complex, how can agencies ensure data is secured?

The post Federal Executive Forum Secure Cloud Computing Strategies in Government Progress and Best Practices 2024 first appeared on Federal News Network.

]]>

Cloud technology continues to play an integral role in driving innovation at federal agencies. As use cases become increasingly complex, how can agencies ensure data is secured?

During this webinar, you will gain the unique perspective of top federal IT experts:

  • Katherine Ebersole, Acting Deputy Chief Information Officer, Disaster Operations, Federal Emergency Management Agency
  • Allen Hill, Chief Information Officer, Federal Communications Commission
  • Eric Mill, Executive Director for Cloud Strategy, General Services Administration
  • Bianca Lankford, Vice President, Security and Reliability Engineering, Datadog
  • Brian Schoepfle, Head of Public Sector Partner Engineering, Google
  • Steve Faehl, Federal Security Chief Technology Officer, Microsoft
  • Moderator: Luke McCormack, Host of the Federal Executive Forum

Panelists also will share lessons learned, challenges and solutions, and a vision for the future.

The post Federal Executive Forum Secure Cloud Computing Strategies in Government Progress and Best Practices 2024 first appeared on Federal News Network.

]]>