Government Technology Top News

Michigan Gaming Board Will Integrate 19 Legacy Systems in Unisys Online Solutions

April 26, 2017 - 08:00

The Michigan Gaming Control Board (MGCB), which regulates Detroit casinos, horse racing and “millionaire party” charitable gaming, and oversees Native American casinos’ compliance with state-tribal provisions, will integrate data from 19 legacy systems in a new automated platform with cloud-based storage.

Global IT services company Unisys, which does business with U.S. agencies including the federal Transportation Security Administration and the states of Minnesota and Pennsylvania, announced the five-year contract worth an estimated $9.3 million with Michigan’s Department of Technology, Management and Budget (MDTMB) on Wednesday, April 26.

A state contract summary describes it as being awarded from an RFP “to establish an enterprise IT system for MGCB to replace the legacy applications [it] uses to license and regulate its various industries” and “move to one agencywide solution for all data and file management needs for mission-based business processes.”

Caleb Buhs, MDTMB public information officer, confirmed via email the contract was approved in February by the State Administrative Board and has been “officially executed.”

The exact rollout timeline is unclear, but the solution should represent a dramatic change to operations at MGCB, which has seen its responsibilities expand significantly over the past 20 years. Each of its areas of focus “seemed to have its own system which made regulatory compliance very burdensome. Even when they went to e-government solutions it was very ‘paperwork-online,’” Mark Forman, Unisys’ global head of public sector, told Government Technology.

In a statement, MGCB Executive Director Richard Kalm said his agency “has worked diligently to improve internal processes, but multiple legacy systems limit our progress.”

“This is great news for the gaming and horse racing businesses and charities we license and regulate and the citizens of Michigan we are sworn to protect,” Kalm said. Through a communications specialist, he declined an interview with Government Technology.

Unisys has worked with the Secretary of State’s office on vehicle and driver licensing, and supplied Michigan with its Statewide Automated Child Welfare Information System. The state identified it as a potential vendor through market research, Forman said.

It will “create a modern, collaborative operating environment” for the agency and digitize its records and documents, making it easier to find and share information both internally and with the public, Unisys said in its news release.

That includes migrating data to the AMANDA Case Management and Process Automation platform from Canada-based contract partner CSDC Systems, delivered through the Microsoft Azure government cloud.

MGCB will also utilize Unisys’ InfoImage enterprise content management software to streamline workflows and centralize a data architecture originally fragmented across 19 separate, dated systems.

“By using InfoImage and an enterprise content management approach, we abstract out that data architecture problem and we can then take advantage of the content,” Forman said. “It literally is the framework by which you cut across these various siloed approaches.”

The solution’s optical character recognition capability will also modernize document scanning for the agency, letting employees in the field photograph documents on their cellphones, then upload them to case management files digitally as searchable text.

The extent to which the contract will transform MGCB’s workflow and processes is unknown, but Forman said integrating 19 systems into one platform could represent a “huge” savings in time and money.

“I think the easiest way to imagine this is when you have new people coming into a workforce and they have to learn all these different systems, as opposed to something that is more intuitive,” he said.

Vermont Governor Creates Agency of Digital Services, Appoints CIO

April 26, 2017 - 06:00

On April 17, Vermont’s IT organization officially shifted when a new Agency of Digital Services (ADS) took the place of the Department of Information and Innovation. The goal? To refresh the state’s modernization efforts. The move is part of Gov. Phil Scott's strategy to manage the state’s IT infrastructure more effectively.

And on April 25, Scott announced the appointment of John Quinn as the state's new CIO and Secretary of Digital Services. Quinn replaces Darwin Thompson, who took over as CIO and commissioner of the Department of Information and Innovation after Richard Boes departed in January. Boes served as Vermont's CIO for five-and-a-half years under former Gov. Peter Shumlin.

Quinn has been serving as the chief innovation officer in the previous state IT department since January of this year. His career in information technology with the state goes all the way back to 2001.

"John has been integral in the planning of the Agency of Digital Services and in our overall strategy for modernizing state government," Scott said in a press release. "His leadership and experience in IT and project management will continue to be incredibly valuable as we establish this agency and take a more coordinated and accountable approach to managing the state's IT infrastructure."

Quinn’s focus in his new role will be to “take a more coordinated and accountable approach to managing the state’s IT infrastructure.” The goal is to provide more support to state employees so that customer service can be improved and projects can rise above the challenges facing government IT systems.

How California's Public Utility Commission is Using Mobile to Bolster its Workforce

April 26, 2017 - 05:00

SACRAMENTO, Calif. — Limited funding and resources aren’t stopping the California Public Utilities Commission (CPUC) from looking to technology to close the gap between public need and its services.

Following an industry briefing hosted by Techwire* on April 25, CPUC CIO Reza Yazdi discussed some of the potential he sees in magnifying his workforce through mobile technology.

In the larger public safety sense, Yazdi said the smartphone, coupled with a CPUC reporting app, could be the difference between addressing utility issues quickly and experiencing a public safety emergency.

“Mobile reporting, as you know, everybody has a mobile device these days and we would like to make sure that these mobile devices can help us to provide the type of service that the public is expecting … to receive,” the CIO explained. “Basically, then, it expands our workforce to millions of people.”

As it stands, Yazdi said, the mobile project could put the power to spot problems throughout the expansive public utility network in the hands of potentially millions of Californians.

Though almost every state agency could make the argument for more resources, the CIO said utilities, like electricity and gas, could potentially pose a public safety risk if not addressed quickly.

Rather than relying solely on occasional phone-based reports — which depend on residents knowing, or taking the time to find, the correct number — Yazdi said a mobile solution could allow for instant reporting, coupled with other useful tools like geolocation and photos.

“We have a limitation in terms of the number of investigators or people working for CPUC," Yazdi said. "Adding a mobile device, it will expand our hands from hundreds to millions of people. Then we can collect all of the information and try, based on the information we receive, [to] offer better services, especially in the public safety area.”

*Techwire is a publication of e.Republic, Government Technology's parent company.

New York Names IBM Veteran Robert Samson as CIO

April 25, 2017 - 15:30

Less than a month after the departure of Maggie Miller, the state of New York has found its new CIO in 36-year IBM leader Robert Samson. The announcement was made Tuesday, April 25, as part of a series of appointments by Gov. Andrew Cuomo.

Samson is taking over the state's IT operations after retiring from IBM in 2009. He occupied several leadership positions at the company, including stints as general manager of its Global Public Sector business and as vice president of its Worldwide Systems and Technology Group.

Samson sits on the advisory board of the State University of New York Center for Technology in Government and participates in the state's Spending and Government Efficiency Commission, which was involved in the state's decision to move email to the cloud.

Miller left the Office of Information Technology Services March 31 after serving as the state CIO for two years.

Alaska Moves to Consolidate IT, Hires New CIO to Lead the Charge

April 25, 2017 - 11:05

Gov. Bill Walker signed an administrative order April 25 consolidating Alaska's IT and telecommunications assets to a standalone agency under the leadership of a state CIO. 

Until now, the functions had been under the purview of the state’s Department of Administration, which delegated technology responsibilities to executive branch agencies. That federated structure is blamed for what is described in a press release as “inconsistencies or redundancies,” which “hampered” efficient delivery of information technology services throughout the state.

"This administrative order will strengthen our IT functions, reduce overall costs, maximize efficiency, and allow us to tap into the talents of our entire team as we adapt to a changing world,” Walker said in the release. “This concept has been years in the making, and I’m proud we can move forward with smart moves to streamline the delivery of services.”

Administrative Order 284 outlines 12 key duties of the office, including the appointment of a chief information security officer, regular reporting to the Department of Administration, and regular statewide review of systems and programs. 

After creating the agency via the administrative order, Walker tapped Bill Vajda, a technologist with ties to the Bush and Obama administrations and a breadth of experience in government and industry, as the state's first CIO to lead the effort.

"Bill Vajda has served as the CIO for the U.S. Department of Education and the acting CIO for the National Security Agency, as well as serving in several policy functions in the White House. Bill’s depth of knowledge and experience will be valuable as we begin this process,” Department of Administration Commissioner Sheldon Fisher said in the release.

The process is the result of legislative action dating back to 2014, which tasked the Department of Administration with producing a strategic path forward for Alaska’s IT infrastructure.

Vajda's predecessor, Jim Bates, served in the role from May 2013 to February 2016, with Information Technology Officer Jim Steele acting as an interim director after Bates' resignation.

In an exit interview with Government Technology last year, Bates said he had worked on new legislation that would strengthen the office for more comprehensive IT governance statewide, as well as an IT inventory analysis that would allow the state to streamline procurement.

How Cities and Automakers See California's Autonomous Vehicle Regulations

April 25, 2017 - 09:30

SACRAMENTO, Calif. — On March 10, the California Department of Motor Vehicles (DMV) released its revised regulations on driverless cars. The new rules eliminate the requirement for a human driver to be present inside autonomous vehicles (AVs), prohibit manufacturers from advertising autonomous vehicles that do not meet the DMV’s definition, and require AV producers to submit a 15-point self-certification safety checklist to the National Highway Traffic Safety Administration as well as the DMV.

The release of the newly proposed guidelines kicked off a 45-day public comment period during which industry representatives, advocacy groups and local governments were able to make their views known in order to influence the final product. On April 25, DMV Director Jean Shiomoto, DMV Chief Counsel Brian Soublet and DMV Deputy Director Bernard Soriano held another public hearing in Sacramento to conclude the public comment period.

Startups, automotive giants, transportation network companies, several California cities and individual residents provided their views on the proposed regulations.

Local Government Input

Four cities sent representatives to the Capitol: Los Angeles, San Francisco, San José and Beverly Hills. “Cities are where the costs of automation are borne. We are building the infrastructure, installing the signals and responding to the crashes,” said Jennifer Cohen, director of government affairs for the Los Angeles Transportation Department. “However, with a carefully regulated policy, we will also be reaping the benefits of safer streets, cleaner air, increased mobility and decreased congestion.”

In order to fully realize the benefits of AVs, the technology must be able to fit into the existing system of infrastructure and mobility. “Autonomous vehicles need to support our efforts to build safe streets and achieve our Vision Zero traffic goals,” said San Francisco Municipal Transportation Agency Director of Sustainable Streets Tom Maguire. One suggestion to ensure safety offered at the hearing was to limit the speed for AVs to 25 mph on city streets.

Both Los Angeles and San Francisco argued that there are two main factors that will help cities maximize the benefits of AVs. “Cities have an obligation to provide safe, well-run infrastructure and cannot do that without access to data and street management,” said Cohen.

Because AVs constantly measure congestion and register any potholes they encounter, the data they generate is invaluable for local governments. With it, cities would be able to plan more efficient infrastructure projects and be alerted to necessary maintenance.

The information is also crucial in designing safer streets. “Cities need comprehensive data on disengagements and collisions in a standardized format on a regular basis,” said Cohen. Armed with data on where AVs are forced to hand control back to a human driver, cities could examine why it may be happening. And although AV permit holders are required to log any disengagements and provide that data to the DMV, it is only done annually, which is too infrequent to be useful to cities, argued Maguire.

AV Manufacturers Weigh In

Several representatives from the auto industry appeared at the hearing, and their message, not surprisingly, was that some regulations go too far. One issue brought up by Ford’s Andre Welch was the suggestion that all types of vehicles adhere to the same AV permitting process.

Truly recognizing the benefits of AVs, Welch explained, means exploring the possibilities for “autonomous multi-passenger shuttles.” Last fall, the company acquired Chariot, a Bay Area-based startup that specializes in on-demand shuttles.

Matthew Burton, legal director for Uber, requested a similar action from the DMV. The company bought Otto, a self-driving freight truck company, in August 2016. “We believe testing permit regulations are adequate for all motor vehicles,” he said. "There is no need for a separate permitting process for freight vehicles."

Soublet responded to the request by stating that the DMV intends to work on a separate permitting process for freight trucks and vehicles over 10,000 pounds after the smaller vehicle regulations are complete.

One issue brought up by several auto manufacturers involved reporting regulations for replacement parts that may improve the design or safety of the vehicle. Paul Hemmersbaugh of General Motors, who previously worked as chief counsel for the National Highway Traffic Safety Administration, said, “Our suggested changes would ensure that technology improvements would not be unduly delayed by amendment applications and the accompanying 180-day administrative review process.”

In order to get this life-saving technology to the public as quickly as possible, safety improvements need to be made rapidly, with an explanation provided to the DMV at a later date, said Hemmersbaugh.

Ron Medford of Waymo, the spin-off from Google’s self-driving car project, reiterated some of the same concerns as other automakers, but added one unique request.

In the proposed regulations, a remote operator must be able to take over an AV in case of an extreme emergency. This can be either a communication link for someone to guide a user on how to bring the vehicle to a safe stop, or a remote controller of the vehicle. Medford requested that the "remote operator" functions be allowed to be performed by multiple persons within a single entity.

CIOs Dig Deep on Emerging Tech, Collaborating with the Feds and Identity Management at NASCIO Conference

April 25, 2017 - 07:30

ARLINGTON, Va. — Whether it’s protecting critical infrastructure from hacktivists and bad actors, working to securely authenticate identity management for constituents or implementing emerging technologies that are poised to change the world, as one tech chief put it, the IT officials and private-sector partners who attended day two of the 2017 NASCIO Midyear conference on April 25 got real about what they’re facing.

In “learning lounges” designed to offer a more informal approach to important discussions, they shared experiences and challenges, and offered advice and support on how to move forward on the various issues.
 

Cyber and State, Fed Collaborations

Hacktivism has made headlines for its disruptive and highly personalized nature — and it’s something with which Oklahoma CIO Bo Reese is quite familiar. In 2015, a Republican lawmaker in the state introduced a bill that would have made it illegal to wear a hoodie in public, which got a lot of attention — including from hacktivists who pointed botnets at the state’s infrastructure.

And hacktivists, said Andre McGregor, director of security at Tanium, are constantly searching for vulnerabilities. “They’re looking for something they can use to take control,” he said. “They won’t go after your most important machines; they’ll go after the ones you don’t know about.”

Cybersecurity was also one of four concrete examples CIOs pointed to of federal partnerships that continue to benefit states. More specifically, Connecticut CIO Mark Raymond noted that the Center for Internet Security’s Multi-State Information Sharing and Analysis Center (MS-ISAC) is a partnership all levels of government should consider, and not just for its threat monitoring and advisories.

“When we were developing our cybersecurity strategy,” Raymond said, “we had the MS-ISAC come in.”

Also cyber-related is Minnesota CISO Chris Buse’s use of the federal Scholarship for Service program, in which students studying for degrees in cybersecurity get their college expenses covered if they work in government. The general rule of thumb, Buse said, is one year of college is paid for by one year of service in government. And students can get credit for up to three years.

“This is a really big deal if you’re in the government space looking for cybersecurity, because you have a captive audience," he said. "This is a really nice way to get top talent into government.”

Switching gears, Ohio CIO Stu Davis shared his experience working with 18F to craft a data analytics RFP. 18F Director of State and Local Practice Robin Carnahan helped turn the RFP process upside down in the state, which Davis said was looking to get proposals from innovative firms that don’t normally do business with government. “It’s going to be an interesting process as we go through this,” he added.

And Deputy Associate Administrator Dominic Sale noted that the General Services Administration, which houses 18F, can help states in other ways. The Federal Risk and Authorization Management Program (FedRAMP), for instance, is something states can take advantage of today, as are the agency’s resources around identity management. For example, best practices at IDmanagement.gov are available to all states. Sale said that officials can take this system and apply it in their states.

Sale also noted that Login.gov — a single sign-on solution for government websites — launched this week for some federal partners, though he has not yet reached out to states on the effort. “Believe me,” he said, “I'm working on that.”
 

Who Are You?

On the path to citizen-focused digital government, one significant challenge is how to securely authenticate the identity of those who want to conduct their public business online. Thanks to funding from a program called NSTIC, the National Strategy for Trusted Identities in Cyberspace, 15 states are currently engaged in pilot programs to test new technologies that can better protect against tax fraud — an expensive problem across the public sector. 

Georgia Chief Technology Officer Steve Nichols was on hand at a NASCIO panel Tuesday, April 25, alongside vendor MorphoTrust, to talk about their ID management pilot program, which involves the Department of Revenue, Department of Driver Services and Georgia Technology Authority. 

“One of the fundamental issues everyone has to grapple with is identity proofing — your business process for proving that you’re you or I’m me,” Nichols said in an interview with Government Technology in advance of the panel.

Georgia’s pilot takes advantage of the identity proofing that already goes into getting a driver’s license — a system that happens to be from MorphoTrust. Taxpayers seeking to protect their tax return from getting into the wrong hands can opt into the system using an app, which requires authentication via selfie. The submitted picture is compared to the driver’s license photo the state has on file to ensure a citizen is who they say they are.

But Nichols cautions that many current pilots may not make it past the pilot stage. The costs are simply too high. “It’s pretty tough to make a business case for this stuff,” he said. “None of this would be happening without the NSTIC grants, that’s for sure.”


Changing the World with Tech

Lexington, Ky., CIO Aldona Valicenti sees great potential for emerging technology in local government. “IoT opportunities are driven by citizen needs,” she said, pointing to sensors as a relatively low-cost technology that can pinpoint services like garbage collection and leaf collection — a service that Lexington performs for its residents. “The very basic services that aggravate people … those will be the things that are going to drive IoT investment and it’s going to be at the city level.”

Among the other technologies discussed during the panel were drones, digital assistants, blockchain, virtual and augmented reality, and connected and autonomous vehicles — all of which have potential in government and many of which are in use today. The conclusion attendees kept returning to, however, is that solid policy has to be in place to resolve the many issues presented by new tech.

But it can be tough to move at the speed of technology given the legislative constraints CIOs operate under. Texas CIO Todd Kimbriel described the state’s three-year budget process, adding that agile development methods are helping his agency be more responsive. Government Technology interviewees at the event unanimously agreed that CIOs should be at the center of the conversation on incorporating emerging tech into government. Its potential to upend how government does business can’t be overstated.   

“Blockchain will have the same disruptive effect as virtualization,” Kimbriel said, eventually enabling things like online voting. “Think about how that changes the world.”

IT's Role in Solving Pennsylvania's Budget Problem

April 25, 2017 - 07:00

Pennsylvania Gov. Tom Wolf's State of the State address in February underscored the commonwealth's lingering budget woes, encouraging decision-makers to support policies that would put them on a "sustainable fiscal course." 

At the 2017 NASCIO Midyear Conference this week, CIO John MacMillan talked about a recently embarked-upon shared services project that will help move Pennsylvania back into the black. 

Skyhigh Becomes First Cloud Access Security Broker in Cooperative Purchasing Program

April 25, 2017 - 06:00

Skyhigh Networks, the first cloud access security broker to achieve cybersecurity compliance across the federal government, has entered into a partnership with the government purchasing cooperative CIS CyberMarket.

The company offers a security layer for cloud services in the form of control points. Using Skyhigh, information technology workers can define user roles, set permissions and track data. Security has been a concern of government officials looking to move systems to the cloud, especially when it comes to sensitive or personally identifiable information.

It’s the first time CIS CyberMarket, which serves state, local and tribal governments, has offered a cloud access security broker (CASB) to its members.

“When organizations move to the cloud, they leave behind the network perimeter and need a new point of control for all their users, devices and cloud applications,” said Jon Fyffe, Skyhigh’s director of U.S. state, local and education, in a press release. “Skyhigh’s CASB evolves security from prevention to protection, enforcing context and content-aware policies wherever data travels in the cloud.”

Most members of the cooperative purchasing group have plans to move to the cloud, according to the statement, but need security assurances before they can do so. As part of the partnership, CIS CyberMarket has negotiated discount rates for Skyhigh.

“Skyhigh’s CASB allows us to strategically use the cloud without compromising security and governance,” said Missouri Chief Information Security Officer Michael Roling in the statement.

Skyhigh is the only CASB to achieve authorization through the Federal Risk and Authorization Management Program, which federal agencies use to assess cloud vendors. State and local governments also often use FedRAMP as a proxy standard for buying cloud services.

Cybersecurity, Agile Development Top of Mind at 2017 NASCIO Midyear Conference

April 24, 2017 - 08:00

ARLINGTON, Va. — Virginia is a leader when it comes to cybersecurity; just ask Gov. Terry McAuliffe, who told the audience at the 2017 NASCIO Midyear conference on April 24 that as chair of the National Governors Association, he has “made cybersecurity the No. 1 signature issue for all 50 states.”

McAuliffe also told the more than 530 government and industry attendees, whose discussions revolved around agile development, top strategies, management processes and solutions, that although cybersecurity is an issue for all levels of government, “Washington has done a very poor job of outlining a national strategy for how we take care of the states. We don’t even have a committee in Congress; we at the states are left to do that ourselves.”

And as a nation, he added, we are only as strong as our weakest link. Though we have made tremendous progress, there is still much to do, as several states have been hacked in the very recent past.

“If something happens in your state and your individual taxpayers’ information is taken from them, you are going to pay a price for that,” McAuliffe said. “And you should pay a price. My most important task as governor is protecting the data that we have in Virginia.”

And his CIO, Nelson Moe, elaborated in a later session, noting that Virginia is the first state to adopt a cyberplatform and move forward with its sharing organization for cyber. “The key in Virginia is to be prepared; Mike [Watson, state chief information security officer] and I work on our incident response plans all the time,” he said, and mentioned that Virginia gets attacked every three seconds. “And the Internet of Things makes a larger attack space and decreases cost for the bad guys; it costs them less to create a problem for us.”

In Michigan, CIO David Behen and his team are working to combine mobile first, big data and cybersecurity with MiPage — what he described as a personal concierge for government services in the state that is personalized and predictive.

“It’s personalized data for you,” he said. “How are we going to use data to fundamentally change how we do customer service? … If you don’t have cybersecurity, how are you going to be sure you're protecting that data?”

Because if there’s a breach, Behen said, constituents’ confidence in your system is gone. And the state intends to solve the problem through public-private partnerships. In fact, in a few weeks the state plans to release an RFP for the Michigan Threat Analytics Center, where predictive analytics will show officials what threats the state will face next. A playbook on the concept will be released simultaneously.

Mississippi CIO Craig Orgeron noted that when it comes to implementing projects and programs, the state utilizes public-private partnerships as well. “We try to do the things that we are good at, and we try to partner where we need to partner, exploit those relationships,” he said.

Also a high priority was agile development, which is making its way into many state-level projects. In California, for instance, Deputy CIO Chris Cruz mentioned the state’s history of developing “these big, monolithic projects” where halfway through, something unforeseen would occur or the budget was already blown. It was these situations that prompted state officials to look at the project delivery process and take a different approach.

And the approach taken for California’s Child Welfare Services-New System project, which Cruz said is the largest in the country, is agile.

Cruz was first introduced to agile in Health and Human Services. “We were taking a waterfall approach to an agile project, so I think that helped us expedite an agile approach,” he said, adding that a great benefit of this approach is that if vendor A is not working out, “we can hire vendor B within a week or two.”

Preliminary results of a NASCIO/Accenture study found that an agile approach helps states achieve more of the results they want. More specifically, 74 percent of respondents found that agile supports increased customer engagement and business ownership, 71 percent found improved customer satisfaction and 68 percent experienced improved quality when using agile development.

For Cruz, one statistic in particular resonated — 65 percent of respondents found that agile supports improved transparency. “A lot of project directors tend to overpromise and underdeliver; we want to underpromise and overdeliver, which is doable with agile.”

One thing that Minnesota CIO Tom Baden noted wasn’t included in the preliminary results of the study is that agile removes a lot of the friction between all of those accelerations; small increments are worked through, which he said makes the project less risky and of better quality.

“If we’re a little off course we can adjust quickly; there’s less risk and greater flexibility,” he said. “But if you don’t have great leadership the whole way, you won’t succeed.”

For Accenture’s Keir Buckhurst, there’s not necessarily less risk, “but what you’re risking is a lot smaller. It’s the flexibility that’s what’s most important.”

As for how states are approaching agile, it’s a bit of a mixed bag. “Some states are supporting agencies as they dip their toes in the water, some are more active advocates … while others start a grass-roots effort to grow it across the state,” Buckhurst said. “Then some CIOs have said to me candidly, ‘Waterfall isn’t working, but I haven’t figured out agile yet.'”

One thing to consider for those who’ve not yet gotten their feet wet with agile, he said, is to get some assistance.

“It is critical to have an agile coach particularly through the first few initiatives,” Buckhurst noted. “They’ve actually done it in the field and can tell them how to do it.”

How Agile Is Ushering Millennials into IT in Nebraska

April 24, 2017 - 07:30

ARLINGTON, Va. — At the NASCIO Midyear conference in Arlington, Va., on Monday, April 24, agile development was on the agenda in a big way. Identified as a top 10 priority for 2017 for the organization, the incremental delivery strategy is bringing a pretty significant side benefit to the workforce in the state of Nebraska, according to CIO Ed Toner. Turns out agile is the preferred development strategy for millennials — a demographic CIOs are anxious to incorporate into their workforces. 

When Toner needs a little inspiration, he told Government Technology, he visits the team of millennials working on the state's enterprise content management system on the floor below the CIO's office. They've logged 20,000 hours of development time to date, Toner reports, and they're excited about the opportunities working for the state has provided them. So much so that now they circle back to Southeast Community College to encourage others to join them. 

Atlanta’s Internal Platform Turns Public to Provide Real-Time Commuter Data Following Bridge Collapse

April 24, 2017 - 07:20

Videos of motorists driving through billowing black smoke were the signature images when a bridge collapsed on Interstate 85 in Atlanta recently, but in that crisis, city officials seized opportunity — going live 12 days later with a dynamic website that uses data in new ways to battle gridlock.

No one was hurt in the dramatic chain of events during evening rush hour on Thursday, March 30. But the fire underneath the elevated highway segment, which is alleged to have been illegally set, spread to plastic conduit being stored underneath by the Georgia Department of Transportation (GDOT) — burning so hot that it brought down the roadway above. I-85 is expected to remain closed in both directions north of downtown until at least June 15.

On Monday, April 17, work on a gas line buckled lanes on Interstate 20, closing it southeast of the city center for nearly two days. In a case of good timing, public and private agencies had debuted CommuteATL.com, their new website optimized for mobile use, the previous week.

Metrics recorded the day of the I-20 shutdown showed more than 6,000 users had already visited the site. By Thursday, April 20, an average of 1,200 views per day had swelled that total to 9,500.

CommuteATL is powered by technology from Redlands, Calif.-based GIS software provider Esri and partner Waze, creator of a real-time, crowdsourced navigation app.

But it incorporates key information, including GDOT 511 camera footage at intersections, and four city data layers: the latest from Waze, real-time Metropolitan Atlanta Rapid Transit Authority (MARTA) train arrival times, Relay Bike Share station locations, and the latest from the 2.7-mile Atlanta Streetcar loop.

It lets drivers and city officials see traffic jams, closed roads, accidents and alerts as they happen. City planners can communicate with Waze about the routes it creates for users and share citizen feedback. Members of the public get a dashboard with live updates on traffic data that let them pick the best path through the city — whether by foot, bicycle, train, streetcar or vehicle.

In a tweet April 14, Atlanta Mayor Kasim Reed called the site “another tool in the tool box to help you better navigate traffic during this very difficult time.”

The website, a collaboration including the mayor, CIO Samir Saini, Chief Resilience Officer Stephanie Stuckey, Deputy Chief Operating Officer and Public Works Commissioner William Johnson, and GIS employees, began as something quite different. It first launched in early April as an internal-facing way for public works and joint operations center workers “to get an understanding of what was happening on the street,” Saini told Government Technology.

“But once we built that out for internal use we started looking at each other [saying], ‘Well this could be really valuable,’” he said. Conversations among members of the mayor’s task force, formed to deal with traffic congestion, helped identify the four main city data layers.

Stuckey contacted Esri’s Disaster Response Program, which offers cities consulting and technical support during disasters. The company's local government account manager, Rob Hathcock, said the process was streamlined by pre-existing contracts Atlanta had with Esri and Waze, which meant the city already had most of the tools it needed to create the platform. One exception, he noted, was upgrading its account with ArcGIS, the company's mapping and analytics platform, to host the services.

“When we first pulled in the disaster response team with the city of Atlanta, we kind of did a little bit of discovery, asking them what they wanted us to provide. They couldn’t mitigate the issue with the bridge. What they could do was show how the city responded to this, ‘What can we show our constituents after the bridge collapsed?’” Hathcock said.

In 2016, Atlanta was accepted into the 100 Resilient Cities network, a $100 million investment by the Rockefeller Foundation to help cities avoid being buffeted by physical, social and economic challenges. It also joined the Waze Connected Citizen Program, a free data-sharing partnership, last year.

Both partnerships helped fast-track the creation of CommuteATL, believed to be the first site to move real-time information in both directions, informing Atlanta officials on one end and Waze users on the other, and letting City Hall communicate directly with Waze.

“If that’s true, that’s great,” Saini said. “You only really feel the value of these kinds of partnerships and this data-sharing in times of crisis, or at least it’s emphasized a whole lot more. We’re proud of it.”

“What is coming to us is an official source of two buckets of information,” Adam Fried, global partnerships manager at Waze, said, describing the feed from City Hall of planned events like road closures and sporting events, as well as real-time events like weather, disasters and protests that affect transportation.

And there’s more to come from the website, which had start-up costs that were “virtually nil,” Saini said, consisting mainly of the extra staff time needed to stand it up.

He’d like to see CommuteATL become a real app in addition to an optimized website. And at least until I-85 is reopened, officials plan to continue adding data layers regularly while preserving an easy user experience. In the planning stages are ride-share and car-share information, MARTA bus schedules and more Relay Bike Share locations. Last week, the city announced the program’s expansion from 100 bikes at 22 locations to 500 bikes at 65 locations less than a year after its launch.

“We’re not going to pull the plug on it once the bridge is fixed,” Saini said. “We’re probably just going to ask citizens, ‘Where do you want us to take this thing?’”

What Are the Most Common Types of Special District?

April 24, 2017 - 06:00

Special districts are the most common type of local government by a large margin. But what are the most common kinds of special district?

That would be the fire protection district, according to the 2012 Census of Governments — though it really depends on how you slice it. That’s just the most common single type of special district. By category, the most common type of special district deals in environment and housing.

No individual kind of special district makes up a very large percentage of the total. These entities handle a large array of specific activities — there are 3,248 drainage and flood control districts, for example, and 3,438 housing and community development districts.

And those are just the big ones. There are special districts for hospitals, welfare, soil and water conservation, cemeteries and highways, among other things.

The interactive graphic below shows the most common categories of special districts, including breakdowns of every category that included sub-categories.


Analytics, AI and Orchestration are Top New Security Topics

April 22, 2017 - 08:00

I’m often asked what I like best about my job. One of my top answers is public speaking, learning and networking at security and technology events around the world.

Besides giving press interviews or speeches on cyberthreats, I really enjoy moderating panels and leading executive roundtables with public- and private-sector leaders at security and technology events. I often get asked to be a moderator for a few sessions at SecureWorld Expo events, InfraGard Conferences and regional technology forums, such as the upcoming MidWest Technology Leaders event.

During these panel sessions, the participants typically talk about a range of (hopefully intriguing) topics that include top cybercrime trends, cyberthreat intelligence, attracting and retaining cybertalent, big industry security breaches, internal security incidents or the always interesting (but overused) question, “what’s keeping you up at night?”

Inevitably, security and technology topics include well-known themes that I have written about, such as ransomware, IoT botnets, cloud computing, smart cities, smartphone security, government CISO plans, securing the smart grid, end-user training, etc. Hopefully, we get beyond the problems and spend a few minutes on solutions. Nevertheless, promising emerging technologies are often shortchanged in these panel discussions due to a lack of time.

Hazards on the Horizon Panel at SecureWorld Expo 2017 in Boston

Behind the Curtain

I often learn more in pre-event discussions, one-on-one CISO breakfasts and panel preparation sessions than I do during the actual sessions. There are many reasons for this, but most panelists want to stick to a set of pre-negotiated company or government “talking points.” Many CISOs and other tech leaders don’t want to discuss specifics about their company or a difficult security situation in public, since stock prices, business reputations, brands and more can be affected. In addition, as I have explained before, no security or tech leader wants to become an accidental news headline.

Meanwhile, the audience tends to ask questions about breach headlines or recent high-profile technology outages with major impacts — rather than seeking a deeper dive into emerging technologies.

So what are the new cybertechnology solution trends I am hearing about the most in private? What cross-industry topics are on the minds of CSOs, CTOs and CEOs — besides their own specific enterprise issues?

The three cybersolution topics I hear most about during these pre- and post-panel discussions are analytics (including metrics), artificial intelligence (AI) and orchestration. In order to honor the “off the record” aspects of these conversations, I won’t be providing names or companies regarding what I’m hearing.

Analytics, ‘Big Data,’ ‘Little Data’ and Cybermetrics

Without a doubt, the topic that every CISO has near the top of their “must do” project list is to do more with cyberanalytics. That is, do more with the data they collect and sector incident data gained through vendor and Information Sharing & Analysis Center (ISAC) partnerships.

There are many companies that offer solutions in this space. Teradata describes cybersecurity analytics in this way: “Big data and deep analytics provide high-speed, automated analysis for bringing network activity into clear focus to detect and stop threats, and shorten the time to remediation when attacks occur.”

Recently, CIO Magazine ran this article: Feds to battle cybersecurity with analytics. Here’s an excerpt:

With more real-time information sharing, officials envision cyber defenses moving from 'vaccine' to 'immune system,' a big analytics project that could achieve something like automatic security. …

Security firms offer a bevy of products that can intervene to mitigate the damage from a person clicking on a malicious link, [former deputy undersecretary of cybersecurity at the Department of Homeland Security] Phyllis Schneck said. But she envisions a much larger, global pool of threat data that could be tapped instantly and automatically to keep machines from falling prey to malicious actors, a system that would be aided by "big analytics" capabilities to make sense of the massive trove of data.

Others think that “big data” is over-hyped, and we need to start thinking in terms of “little data.” Regardless of the approach taken, the discussion always leads to this wider cybermetrics topic with dashboards for management decision-making.
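Much of this cyberanalytics work begins with very simple statistics. As a minimal illustration (the function, the threshold and the failed-login counts below are all invented, not any vendor's product), a z-score test can flag an anomalous day in a stream of log metrics:

```python
from statistics import mean, stdev

def flag_anomalies(daily_counts, threshold=2.0):
    """Return indices of days whose event count sits more than
    `threshold` standard deviations from the mean (a z-score test)."""
    mu, sigma = mean(daily_counts), stdev(daily_counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(daily_counts)
            if abs(c - mu) / sigma > threshold]

# A week of hypothetical failed-login counts with one obvious spike
counts = [102, 98, 110, 105, 2500, 99, 101]
print(flag_anomalies(counts))  # the spike on day 4 stands out
```

Real security operations centers layer far more signal sources and models on top of baselines like this, but the dashboard-ready metric that management sees is built on the same idea.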

Another article from CSO Online reported that predictive analytics can stop ransomware dead in its tracks. The article describes how Livingston County, Mich., has deployed predictive analytics as a defense against ransomware attacks.

Beyond these two examples, I am hearing local, state and federal CISOs tell me that they are planning to do much more in their security operations centers (SOCs) with cyberanalytics products and services. How will this be done? There are numerous approaches, but one set of solutions takes this topic to the next level with artificial intelligence.

Artificial Intelligence (AI) and Cybersecurity

Another topic that is hot right now is how artificial intelligence (AI) will help our cyberdefense efforts.

This recent article by Nasdaq.com describes how IBM’s AI is being used in the Department of Defense (DoD) because humans can’t keep up with cyberthreats.

In addition, “Aside from partnering Watson with H&R Block to process and analyze 11 million tax returns, the other major development has been the recent commercial release of cyber security by Watson to over 8,000 customers. With growing data sharing arrangements among members of the cyber security intelligence community, Watson was able to digest over 700 terabytes of data from just one partner (that is about 150,000 DVDs worth of data, enough to power Netflix for over 34 years without interruption). More data inputs only further empower the potential for AI in cyber security, allowing machine learning software to automatically detect, diagnose and counter cyber breaches in a more informed manner.”
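The DVD comparison in that quote is easy to sanity-check with a line of arithmetic (assuming a 4.7 GB single-layer disc and decimal units):

```python
# Sanity check on the quoted comparison: how many 4.7 GB DVDs
# does it take to hold 700 terabytes?
gb_per_tb = 1000
dvd_capacity_gb = 4.7
dvds = 700 * gb_per_tb / dvd_capacity_gb
print(round(dvds))  # roughly 150,000 DVDs, matching the figure cited
```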

I really like this article from earlier this year by SecurityWeek.com’s Torsten George on The Role of Artificial Intelligence in Cyber Security. The article describes three use cases for AI in cyber, including: Identification of threats, risk assessments and orchestration of remediation.

Here's an excerpt: “Too often, unsupervised machine learning contributes to an onslaught of false positives and alerts, resulting in alert fatigue and a decrease in attention. For opponents of AI, this outcome provides ammunition they typically use to discredit machine learning in general. Whether we choose to admit it or not, we have reached a tipping point whereby the sheer volume of security data can no longer be handled by humans. This has led to the emergence of so-called human-interactive machine learning, a concept propagated among others by MIT’s Computer Science and Artificial Intelligence Lab.

Human-interactive machine learning systems analyze internal security intelligence, and correlate it with external threat data to point human analysts to the needles in the haystack. …”

What Is Network and Security Orchestration?

The last area I hear quite a bit about from CISOs lately is network and cybersecurity orchestration. Like bringing together different instruments to produce beautiful music in a symphony, orchestration brings together diverse tools, processes and people to (hopefully) improve cyberdefense and incident response results.

Security orchestration allows for automation and improved capabilities to navigate the full scope of security operations and incident response activities, from the initial alert through remediation. This excellent Siemplify article describes three aspects:

Context – understanding of the relationships across alerts, intelligence and security data, organized into prioritized cases with the complete contextual threat storyline.

Automation – integrating automated capabilities in a flexible manner, from basic playbooks to semi-automatic workflows to complete automation of incident response where appropriate. One size fits all doesn’t work with security automation.

Analyst Enablement – giving analysts the proper tools and visibility to effectively intervene throughout the investigation and response process, ultimately ensuring we are curing the disease, not just the symptoms.

In this Network World article by Jon Oltsik from earlier this year, the state of incident response and security orchestration is described in more detail. He covers several vendor products and the outlook for the near future.
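To make those three aspects concrete, here is a toy playbook sketch. Everything in it is invented for illustration — the threat-intelligence set, the alert fields and the actions do not come from any vendor's product:

```python
# A toy incident-response "playbook": add context to an alert,
# auto-contain the clear-cut cases, and escalate the rest to an analyst.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # hypothetical threat intel

def run_playbook(alert):
    # Context: correlate the alert with external threat intelligence.
    alert["known_bad"] = alert.get("src_ip") in KNOWN_BAD_IPS

    # Automation: fully automatic response only where appropriate.
    if alert["known_bad"] and alert.get("severity") == "high":
        return {"action": "block_ip", "target": alert["src_ip"]}

    # Analyst enablement: everything else goes to a human, context attached.
    return {"action": "escalate_to_analyst", "context": alert}

print(run_playbook({"src_ip": "203.0.113.7", "severity": "high"}))
```

In a real deployment the "block" branch would call a firewall API and the escalation would open a case in a ticketing system; the point is the division of labor among context, automation and the analyst.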

You can also learn more about the security orchestration market at this Business Wire article.

Final Thoughts

I am heading to another National Association of State CIO (NASCIO) Midyear 2017 meeting (follow at #NASCIO17 on Twitter) this week for discussions and networking with public- and private-sector partners. I always learn more about where things are heading in federal and state government cybersecurity and infrastructure at these gatherings, and we will be discussing many of the same topics that I have written about in the past year.

One breakout session covers state government examples of what I think was the top cybertrend of 2016, namely hacktivism, and how hacktivists have been active all over the country.

In a keynote session, Virginia Gov. McAuliffe is scheduled to deliver remarks, which will no doubt touch on cybersecurity and on what governors are doing through his role as NGA chair.

But regardless of whether you will be at any of these security and technology events or not, I urge you to engage your team and vendors in deeper discussions regarding these three relatively new security topics. Analytics, AI and orchestration are already elbowing their way onto enterprise security agendas around the world, and regardless of the security problem — these topics are key pieces of cyberstrategy road maps and security solutions as we head toward 2020.

In conclusion, I started my industrywide 2017 cybersecurity prediction roundup at the end of last year saying cyberconcerns continue to escalate. And you ain’t seen nothing yet.

New Orleans Uses Targeted Approach to Spread Tech Awareness, Improve Digital Equity

April 21, 2017 - 12:00

New Orleans has collected 28 responses to a Request for Information (RFI) stemming from its Promoting Pathways to Opportunity Challenge, an initiative aimed at gathering ideas to improve digital equity within the city — and now officials are conducting an ongoing review of the proposals to determine which may be a fit.

New Orleans CIO Lamar Gardere told Government Technology that the challenge's ultimate goal is to bridge the digital divide in his city and increase technology use among historically under-represented groups. Gardere said there are many different programs in New Orleans that address facets of the digital divide — one around broadband, one that provides technology training, another for workforce development — but what officials have realized is that regardless of how numerous these programs are, it’s difficult to get residents to adopt them.

“It’s well-known that in technology, there are groups of folks that are underrepresented,” Gardere said. “African-Americans tend to be underrepresented, women tend to be underrepresented, minorities in general tend to be underrepresented in technology, and what that creates is this reticence to adopt a program that’s targeted at you.”

What Gardere and those he works with learned was that often the people digital equity programs were designed to help saw the initiatives and assumed they were for someone else. So, they took a more specific approach.

New Orleans is one of the most singular cities in the United States, rich with a creative culture and residents who fuel it through music and other expression. Broadband and Internet access are key, but what Gardere described is an initiative that reached out to under-represented populations and spread awareness that they, too, deserve a place in tech.

A songwriter, for example, may be using digital resources to record and store music, but he or she may not realize that the skill set they’ve learned doing this is applicable to a job in digital media. Another example Gardere gave was using telemetry to record activity on a basketball court, and then having experts review findings with the players about how fast they ran, how high they jumped and whether their footwork was correct. A chief idea driving the effort to improve digital equity in New Orleans has been to “leverage something that people already enjoy doing,” he said, so people feel comfortable participating in tech.

“It’s about things that you would be looking at to improve how well you’ve been performing in the thing you enjoy — basketball — and at the same time you’re analyzing data about a situation,” Gardere said. “Paired with the appropriate person, you are now learning how to do data analytics. You are more prepared than you think to be a data scientist, and we want folks to realize that. We want folks to imagine themselves in those roles that they are perfectly qualified for, but may not really realize they are qualified for.”

Gardere and Deputy CIO Sara Estes White compare this kind of approach to sneaking vegetables into a smoothie. Everyone likes a smoothie, and, if made properly, one can get the vegetables they need from it as well.

New Orleans is conducting an ongoing review of the 28 proposals from the RFI, and, depending on what they find, the future may involve fostering partnerships among participants, bringing some in to work with the city or offering services such as space to work in recreational facilities. 

Educating Alexa: Feds Look to Make Public Service Info Available Through AI Personal Assistants

April 21, 2017 - 07:30

The federal government’s sometimes arcane and esoteric storehouse of knowledge and services could soon be voice-activated.

That’s because the General Services Administration’s Emerging Citizen Technology program — part of the fed’s Technology Transformation Service’s Innovation Portfolio — is piloting a new initiative to make public service information available through retail-level artificial intelligence-driven personal assistants (IPAs).

If the effort yields results, the Amazon Alexa, Google Assistant, Microsoft Cortana — even Facebook Messenger’s chatbot — could have easy access to an alphabet soup of agencies.

Or, as GSA put it in a declaration of principles: “These same services that help power our homes today will empower the self-driving cars of tomorrow, fuel the Internet of Things, and more. As such, the Emerging Citizen Technology program is working with federal agencies to prepare a solid understanding of the business cases and impact of these advances.”

The idea, the agency explained, is to explore opening its programs “to self-service programs in the home, mobile devices, automobiles” and elsewhere.

The pilot, which the federal government believes calls for swift development, is expected to yield “public service concepts reviewed by the platforms of your choosing” — whether you’re an Alexa, a Cortana or a Siri.

It’s also expected to generate a new field of shared resources — and to educate the tech industry on working with the federal government.

Agencies that are listed as having requested to participate in the pilot include the departments of Energy, Health and Human Services, Homeland Security, and Housing and Urban Development; the Internal Revenue Service; the Law Library of Congress; and AIDS.gov, which will shortly become HIV.gov.

A quote on a Web page for the Consumer Financial Protection Bureau, another participant, indicates that it is “hoping to see how an IPA may better support our need to provide consumers” with financial education, choices and solutions.

And this is just the beginning of … the beginning. The pilot’s first phase focuses on the read-only use of public data, though federal agencies and providers are in talks to expand it.
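The pilot's public materials don't specify an architecture, but a read-only first phase implies something quite simple: mapping a recognized voice intent to an answer drawn from public data. A minimal sketch, with invented intent names and placeholder responses:

```python
# Hypothetical read-only lookup of the kind a first-phase pilot implies:
# a recognized voice intent maps to canned text drawn from public data.
PUBLIC_ANSWERS = {
    "filing_deadline": "Here is what the agency's public data says about filing deadlines.",
    "rental_help": "Here is what the agency's public data says about rental assistance.",
}

def handle_intent(intent_name):
    """Return spoken text for an intent; read-only, no user data touched."""
    return PUBLIC_ANSWERS.get(
        intent_name, "Sorry, I don't have information on that yet.")

print(handle_intent("filing_deadline"))
```

The interesting engineering lies in the later phases the agencies are discussing, when the assistants move past canned public data toward personalized transactions.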

Even the endeavor’s Wiki is in development, though content “should be populated” by Monday, April 24, officials wrote online, noting they’re “currently processing concepts and ideas from federal agencies and public services.”

Justin Herman, who leads the Emerging Citizen Technology program office, said Thursday, April 20 in a tweet that the effort will be open-sourced.

Open-sourcing the US Federal #AI Personal Assistant Pilot to open public services for Alexa, Cortana etc #govtech https://t.co/RSDJp2VOCA pic.twitter.com/toHx3kYkN2

— Justin Doc Herman (@JustinHerman) April 20, 2017

Its exact timeline is unclear, but the effort is already underway.

The first stage, the agency said, is to identify stakeholders, roles and responsibilities — an effort for which it has budgeted “one week, starting now.”

Next steps include developing and implementing a compliance plan, for which another week is set aside; and holding a one-day development workshop to refine business cases and problems, analyze data and development requirements, and collect feedback.

Sacramento, Calif., Is Creating a Reference Architecture to Scale Up Autonomous Vehicles

April 21, 2017 - 07:30

“The message from all of us was simply this: We are all in,” said Sacramento, Calif., Mayor Darrell Steinberg, recounting his meeting with public agencies, elected leaders and representatives from industry stakeholders discussing the plan to begin testing and deploying autonomous vehicles (AV) on city streets and freeways.

Gathered at the Golden 1 Center in downtown on April 19, Steinberg along with U.S. Rep. Doris Matsui, D-Calif., and Sacramento Kings owner Vivek Ranadivé announced their vision for why AV deployment makes sense for Sacramento. Steinberg and Matsui met last December with state Sen. Richard Pan and others to announce the city’s interest in developing a testing hub for self-driving vehicles and the creation of a working group to explore how to make this happen.

Four months later the coalition that was formed, the Autonomous Transportation Open Standards Lab (ATOS), presented a vision for the city’s leadership on autonomous vehicles to car manufacturers, transportation network companies, policy experts and regional agencies.

ATOS, the consortium of policymakers from local, state and federal government as well as representatives from private industry, is hoping to “develop an open standards lab and a protocol that achieves the delicate balance between ensuring that this technology is both safe and at the same time, ensure a regulatory environment where we are not stifling innovation,” said Steinberg.

“This organization is the first in the United States dedicated to speed the development of autonomous vehicle technology,” wrote Ranadivé in a Medium post. “We are creating the HTTP of autonomous vehicles — an open source platform that ensures city governments and private companies have a standardized and interoperable platform to build upon.”

The first standard that will always take precedence is safety, Steinberg told Government Technology. This technology has the potential to radically lessen the chance of serious injuries or fatalities caused by vehicle crashes. Citing that 94 percent of crashes are caused by human error, he said there is a tremendous opportunity that Sacramento can take advantage of.

Additionally Steinberg does not want to see the technology exclusively benefiting the wealthy. As the technology progresses, explained the mayor, we need to ensure that we “don't forget that as we scale this technology, that we are cognizant of serving communities and neighborhoods that have limited mobility options.”

Sacramento has “started creating a reference architecture,” said Ranadivé at the event. "We are creating a stack of what we need to do in order to make the city able to accommodate autonomous vehicles."

“Developing a set of protocols in the capital city can also create a bridge to help state policymakers and regulators adapt and amend their own views of how to regulate this new industry,” Steinberg said.

Among the factors that make Sacramento a desirable destination for AV testing — including the area's temperate weather, proximity to Silicon Valley and diverse population — Steinberg, Matsui and Ranadivé highlighted three: The physical size of the city and its downtown grid create an ideal testing environment, the city is home to state regulators and policymakers, and there is widespread support.

Steinberg referred to Sacramento as the "Goldilocks" of cities, saying, “It’s not too big, it’s not too small." Sacramento is the “perfect petri dish to not only test this new technology, but show how it can be brought to scale.”

The city is rapidly expanding its technological repertoire. Verizon recently chose Sacramento to help pilot its 5G network. The city also announced it’s a lead contender for a $44 million investment in renewable energy technology by Volkswagen to compensate for the company's 2015 emissions scandal. Both these projects “demonstrate that others are seeing Sacramento as a place to test new technology,” said Steinberg.

Having the state Capitol "at our backdoor" is perhaps the greatest draw for AV manufacturers, according to Matsui. “[Sacramento] has never taken advantage of its physical proximity to the center of state and national policymaking,” said Steinberg. We are within “walking distance to policymakers and regulatory bodies.”

As all the players get on board for turning the city into an epicenter for AV testing and deployment, there remain questions about liability and insurance. As a member of the House Energy and Commerce Committee, Matsui is in a unique position to help guide the city through any federal regulatory hurdles. “We want to be first, and we want to be best,” she said. “This is not a pipe dream.”

Ranadivé even issued the Kings Challenge for the city. He is hoping to see 40 to 50 people (including himself, Matsui and Steinberg) driven in an AV to the first Kings game of the 2017-2018 season, which will begin in late October.

As a short-term goal that is great, said Steinberg. But he would like to see some serious progress within the next five years. He hopes the city can take advantage of the technology and “see at least one if not more [public agencies] employing this technology.”

“We’re not afraid to be audacious.”

Young Startup Wants to Train AI Better, Faster Using Synthetic Data

April 21, 2017 - 07:15

Artificial intelligence (AI) — of the “learning algorithms” variety, not the Skynet kind — is everywhere in the tech world right now. It’s because of the concept’s many possibilities: object recognition in pictures and videos, anticipating cybersecurity threats, finding specific kinds of people amid thousands or millions in a data set.

There’s also a fundamental need shared by all AI algorithms: training. They need to process data to learn what it is they’re looking for. That’s how they “learn.”

What if there isn’t enough data to make the algorithms as good as they need to be? Or what if it takes too long to collect and prepare that data?

An early stage startup that just entered a northern Virginia cybersecurity accelerator thinks it has the solution: fake data.

Or, as Automated DL’s CEO Jeff Schilling puts it, synthetic data. His company has built up the capability to produce data — a lot of data — based on historical examples. It’s not real, but it mimics real data closely enough that AI algorithms can use it. And it’s realistic enough that it could be real.

It’s AI acceleration, basically.

“We want the AI people to get there better, faster,” Schilling said.

He described the process of creating fake data in terms of how humans learn to speak. They begin with sounds, then form approximate words, then learn to put those words in sequence via grammar. And just as humans use grammatical rules to put words together into novel sentences, Automated DL uses “grammar” to create novel data.
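As a crude way to picture that grammar analogy (this is my own sketch, not Automated DL's actual method), one can record which event tends to follow which in historical sequences, then sample those transitions to emit novel but plausible data:

```python
import random
from collections import defaultdict

def learn_grammar(sequences):
    """Record which token can follow which (a first-order 'grammar')."""
    follows = defaultdict(list)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            follows[a].append(b)
    return follows

def generate(follows, start, max_len, rng):
    """Emit a novel sequence by sampling the learned transitions."""
    out = [start]
    while len(out) < max_len:
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return out

# Hypothetical historical "sessions" the generator learns from
history = [["login", "browse", "purchase", "logout"],
           ["login", "browse", "logout"]]
grammar = learn_grammar(history)
print(generate(grammar, "login", 5, random.Random(0)))
```

With a different seed the generator emits a different, but still "grammatical," session; scaled up to far richer models, that is the intuition behind training AI on synthetic data.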

There are a few potential upsides to the concept. First, if learning algorithms hit the real world with more training under their belts, theoretically they will perform better. Second, the large amount of synthetic data Automated DL generates can be designed to reflect a wide array of possibilities — including rare situations that might not be represented in the real data.

Schilling pointed to the first known fatal crash of a Tesla in Autopilot mode as an example of where that might come in handy. In that case, the car was driving along a Florida highway when a semi-truck with a white trailer pulled onto the road in front of it. The sky behind the trailer was also white, and the Tesla’s software didn’t distinguish between the two.

Not that Automated DL’s technology could have necessarily prevented that specific incident. But Schilling said the event illustrates the broader need to train AI for scenarios humans might not think of on their own.

Third, Schilling said, many of the people creating AI are simply more interested in what their products are going to be able to do than they are in the task of hunting down data or creating their own.

“Making data for them is boring and a pain in the ass,” he said.

The accelerator the company is going through right now, Mach37, is oriented toward cybersecurity, and that’s a main focus for Automated DL at the moment. As part of that, the company is working with potential partners in the federal intelligence community.

However, Schilling stressed, there’s a wide ocean of possibilities out there for what the company is building. It’s hardware-agnostic and can be folded into other products. AI is growing quickly, and entrepreneurs have found a lot of different industries in which to apply it. So there’s no reason Automated DL wouldn’t work with state and local government — or the companies serving those governments. With a 20-year career at IBM on his résumé, Schilling said he’s well versed in the workings of sub-federal government.

And there are a lot of companies finding new ways to use AI to serve sub-federal governments. AppCityLife is building an AI chatbot for the NYC BigApps competition that could help immigrants access city resources. Pluto AI is working on software to help water utilities predict asset failure. SADA Systems has a platform that can analyze pictures of bridges and identify cracks. And those are just the startups.

Not to mention that cybersecurity, the company’s focus at the moment, is just a tad important to government.

Synergy Between Drones and Enterprise Asset Management Is a New Opportunity for Government (Industry Perspective)

April 21, 2017 - 06:00

If your job has anything to do with using technology to deliver smarter, more effective services — and today, every government job does — there’s never been a better time to work in the public sector. A mix of new, disruptive systems and services, from cloud computing and big data to business analytics and drones, are transforming the workplace and technology space we thought we knew.

Options and possibilities that would have been unimaginable five years ago are quickly becoming basic business tools. It’s a moment when multiple breakthrough technologies are evolving in parallel — all of them at unprecedented speed. At the core of the transition is an enterprise IT backbone that gives organizations the visibility, transparency, digital security and operational control to keep track of large departments or functions, make maximum use of every asset at their disposal, and deliver the responsive, high-quality services that increasingly tech-savvy citizens expect.

Managing Your Assets

In recent years, many more state and local government departments and agencies have become familiar with the basics of enterprise asset management (EAM), a system that invariably includes:

- a database of all the available physical and financial information on an organization’s assets;
- software to execute key processes and workflows;
- barcode identification for all assets; and
- extensive use of mobile technology for field audits and day-to-day maintenance.
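As a concrete (and entirely hypothetical) illustration of how those components fit together, a single asset record tying identification, financial data and a basic maintenance workflow into one structure might look like this sketch:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Asset:
    """One record in a hypothetical EAM database."""
    barcode: str             # barcode identification for the asset
    name: str
    purchase_cost: float     # financial information
    location: str            # physical information for field audits
    last_serviced: date
    work_orders: list = field(default_factory=list)  # workflow history

    def needs_service(self, today, interval_days=180):
        """A simple maintenance-workflow rule: service every interval_days."""
        return (today - self.last_serviced).days >= interval_days

pump = Asset("BC-0001", "Water pump", 12500.0, "Plant 3", date(2016, 10, 1))
print(pump.needs_service(date(2017, 4, 26)))  # True: over 180 days have elapsed
```

A real EAM system layers scheduling, reporting and mobile access on top of records like this, but the core is the same: every asset is identified, located and tracked through its maintenance lifecycle.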

You would be hard pressed to find a more challenging EAM environment than New York City, with approximately 1 million buildings, 2.7 million vehicles entering the city each day, and countless smaller tools and devices in its inventory. Yet it’s never been more important to get value for money from public-sector operations, optimizing every asset to maximize service delivery, anticipate failures and minimize down time.

That’s why forward-looking government managers are moving their EAMs to the cloud, taking advantage of the opportunity to streamline operations in response to “skinny” budgets. Cloud computing also brings an end to the era of local modifications, freeing up on-premises development teams to take on more specific, focused assignments. The takeaway: When the going gets tough, the tough (and the smart) take their EAM systems off-premises.

Cloud-based EAM is the glue that holds the components together. The system identifies, tracks, locates and analyzes an agency’s physical assets and provides the planning and decision tools needed to optimize performance. By generating precise maintenance schedules that minimize unanticipated down time, cloud-based EAM maximizes operating efficiency and makes best use of the labor and materials that are in short supply. Add Internet of Things technology to the mix, and you can build a sensor network that covers critical systems to predict maintenance and repair needs before they become obvious.
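In its most basic form, the sensor-driven, predict-before-failure idea reduces to a simple rule. The sketch below is a deliberately minimal illustration with hypothetical readings, not a production IoT pipeline:

```python
def flag_for_maintenance(readings, threshold, window=3):
    """Flag an asset when the average of its most recent sensor readings
    crosses a threshold, so a work order can be raised before failure."""
    if len(readings) < window:
        return False  # not enough data to judge a trend
    recent = readings[-window:]
    return sum(recent) / window > threshold

# Hypothetical vibration readings (mm/s) streamed from a pump sensor
vibration = [2.1, 2.2, 2.3, 3.8, 4.1, 4.4]
print(flag_for_maintenance(vibration, threshold=4.0))  # True: recent average is 4.1
```

Averaging over a window rather than reacting to a single reading is what turns raw sensor data into a maintenance signal: one noisy spike doesn’t trigger a work order, but a sustained upward trend does.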

Eyes in the Sky

The latest technology opportunity for government managers is the powerful synergy between EAM software and drones. Whether your facilities and other assets are on land, at sea or in the air, they’re often in distant or awkward locations that are difficult or dangerous to reach by road. Your mission may depend on the reliable operation of aging equipment or devices, and the farther they are from the beaten path, the more it costs in time and trouble to organize an unscheduled maintenance trip.

Tethered drones give you a 360-degree view of all your assets, delivering essential data and insights to drive an effective asset management plan. Whether your asset inventory includes trains or shipyards, aircraft or pipelines, buildings or bridges, the synergy between drones and EAM helps improve inspection processes, boost asset performance, and comply with the regulations that govern your work.

Two points that cloud computing and drones have in common: They both emerged in the blink of an eye and are becoming essential tools of the trade for government employees. If your challenge for the foreseeable future is to meet mounting public and stakeholder demands for service excellence, while managing aging infrastructure and limited budgets, the first step is to optimize the assets that you control. And that means tapping in to the cutting-edge technologies that can help you get the job done.
 

What's New in Civic Tech: Philadelphia Blog Captures Stories of Residents Who Used Its Open Data Portal to Do Good

April 20, 2017 - 14:00

What's New in Civic Tech takes a look at highlights and recent happenings in the world of civic technology.  

Philadelphia Illustrates Positive Uses of Open Data in the City

Philadelphia has launched a blog to look at ways its residents are using open data in their everyday lives to make the city a better place.

This project is a collaboration between Philadelphia’s Office of Open Data and Digital Transformation and Temple University’s department of journalism. At the blog's core is recent Temple graduate Kamal Elliott interviewing Philadelphia residents who have used open data in a meaningful way, with an overarching goal of showing others how tangible such pursuits can be.

In a post announcing the project, officials said topics will include how residents have used data to plan programs, grow businesses, advance advocacy and build apps. The blog will also look at how government has benefited from citizens having access to data.

Philadelphia, like nearly all major U.S. cities, has had an open data portal, OpenDataPhilly, for some time. The blog will illustrate ways the portal can be used, part of a broader nationwide trend: civic technologists are no longer content with simply publishing data sets online and are moving on to conveying data in ways more people can digest and understand. Philadelphia’s blog is unique in that it seeks to use narratives and journalism to inspire greater use. Other cities, including Chicago, Boston and New York, have recently overhauled their data platforms to make their interfaces simpler.

Philadelphia is also soliciting user input for a beta version of a new website it plans to launch soon.  

Director of San Francisco’s Startup in Residence Program Leaves for New York City

Jeremy Goldberg, formerly the director of innovation partnerships and the Startup in Residence program in San Francisco, has left his position to become the managing director of NYCx, an initiative based in the New York City Mayor’s Office of Technology and Innovation.

Goldberg began work in New York on Monday, April 17, announcing the change on Twitter.

 

Excited to join @MiguelGamino & the @nycforward team #Day1 #govtech #breakthroughtechnology pic.twitter.com/wgcO5o3sXw

— Jeremy M. Goldberg (@JeremyMGoldberg) April 17, 2017

 

Goldberg’s move follows that of Miguel Gamiño Jr., who left his position in October 2016 as San Francisco’s CIO to become New York City’s chief technology officer. In the wake of Goldberg’s departure, those interested in the Startup in Residence Program are asked to contact Amardeep Prasad.

With Goldberg’s involvement, STiR grew from a citywide initiative in 2014 to an annual program in four regional cities: San Francisco, Oakland, San Leandro and West Sacramento. The idea behind it is a simple one: Embed startups in compatible government departments where they can do the most good. Through STiR, technologists work first-hand with government employees, gaining the insight they need to identify and tackle problems they likely could not have seen from the outside.

STiR participants have seen much success. One such story is Binti, which has been rapidly signing California clients after its time in STiR helped it develop digital tools that make the foster parent application process far more efficient for both social workers and prospective caretakers. Felicia Curcuru, Binti’s chief executive officer, has credited involvement with the program for helping her company sign more than 20 government clients throughout California.

Chicago Launches Redesigned Data Portal

Chicago has launched a redesigned open data portal that seeks both to expand the information available and to make the site simpler for users who aren’t data scientists, joining a number of major cities overhauling their platforms to be more accessible to residents.

The Chicago Data Portal’s relaunch took place Monday, April 17, and the main page of the site featured data sets most likely to draw interest from residents, including the locations of restaurants with sidewalk café permits, information about taxis, restaurant inspections and police incident reports. The platform is a marked improvement in both aesthetics and utility over the one Chicago built in 2012 after an executive order from Mayor Rahm Emanuel required an open data portal. There is also a marked increase in visual elements on the new site, including maps and graphs.

“Open data is not just for the tech-savvy, but for all Chicagoans and visitors to the data portal,” Chicago officials wrote in a statement announcing the launch.

The website is the culmination of feedback gathered about Chicago’s open data portal over the five years it has been available. The feedback was solicited in focus groups conducted in partnership with groups such as the Smart Chicago Collaborative and ChiHackNight. In September 2016, the new design was also shown at the Woodson Regional Library in the city’s Washington Heights area in order to collect feedback.

To further address accessibility concerns, Chicago has created a YouTube channel with tutorials on how to use its open data portal. Residents can also follow Chicago open data developments through an ongoing blog.
