Given the development of a new cyber strategy by the Australian government, it’s worth considering a few of the topics it’s likely to tackle.  This piece gives some thought to intelligence sharing, prompted in part by the extensive effort evident in the 2022 NSA Cybersecurity Year in Review.

In the 2020 Cyber Security Strategy, the Australian government committed some $37 million explicitly to information-sharing, most of it to a cyber threat intelligence-sharing (CTIS) portal already under development, with a small amount ($1.6 million) to help universities improve their own threat sharing.

An independent assessment of the value of such initiatives may help inform the strategy. There has been no public reporting on progress in the university community. The CTIS portal, a hub enabling bi-directional flows of data in machine-readable format, went live on 1 November 2021, so there is now more than 12 months of operational experience to draw on.
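For readers unfamiliar with what 'machine-readable' threat sharing looks like in practice: such exchanges typically use structured formats such as STIX. The specific format CTIS uses isn't detailed here, so the sketch below is illustrative only, building a minimal STIX-2.1-style indicator object with the standard library (field names follow the STIX 2.1 specification; the values are hypothetical):

```python
import json
import uuid
from datetime import datetime, timezone

def make_indicator(pattern: str, name: str) -> dict:
    """Build a minimal STIX 2.1-style indicator as a plain dict.

    Illustrative only: a real CTIS submission may require additional
    fields, sharing markings (e.g. TLP) and transport over an agreed
    exchange channel.
    """
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,        # expressed in the STIX pattern language
        "pattern_type": "stix",
        "valid_from": now,
    }

indicator = make_indicator(
    pattern="[ipv4-addr:value = '198.51.100.1']",  # documentation-range IP
    name="Suspected C2 address",
)
print(json.dumps(indicator, indent=2))
```

The point of such structured objects is that both sides can parse, filter and act on them automatically, which is what makes bi-directional sharing at scale feasible.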

There is potentially much to be gained from sharing intelligence, not least because readiness and defence against cyber threats are not limited to intelligence agencies. Shared intelligence should improve situational awareness, enable better-targeted defence for both government and business, and help minimise duplication of effort.

Government has the greatest means of collection and a responsibility for ensuring the security and well-being of the state and its citizens, and while government may have a right to secrecy, the public has a right to know. Differentiating between the intelligence itself and the means of collection—‘what we know is not as sensitive as how we know it’—can facilitate needed sharing.

Still, simply pushing data is not enough. In the fast-paced, highly interconnected digital world, actionable ‘knowing’ requires trust and relevance.

Trust
Trust—or more accurately, trustworthiness—matters. It is not simply one attribute determining the actionability of shared intelligence—‘can I trust this data?’. It is the pillar on which successful partnerships are based—‘can I trust you with my information and to act in my interests?’.

In an intelligence-sharing partnership, the data that companies share may present privacy risks—for themselves, their staff, their partners and their clients. While governments fear exposure of secrets, companies fear information being used against them, with risks to their competitive position and reputation.

Sharing with government is no panacea for industry. The relationship is inherently unequal.  Government has given itself broad coercive powers, reaching deep into business operations and development across a swathe of sectors through the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018 and Security of Critical Infrastructure Act 2018. Given such powers, government must work hard to show that it is a trustworthy actor.

Yet industry outreach, along with the rest of the ACSC, has been subsumed into ASD, and the CTIS has likewise been folded into the REDSPICE initiative, effectively conflating industry assistance, surveillance and intelligence. Such structural arrangements erode the governance that helps inform judgments of trustworthiness. Industry apprehension spikes when ministers use retributive language and reach instinctively for regulatory tools.

Relevance
Different industries have differing structures, drivers, business models and underlying technologies. There will be a broad base of commonality—NotPetya demonstrated how malware may cross sectors and borders with ease, for example. But threats affecting internet-of-things devices or industrial control systems may be much harder to capture and relevant to a much smaller subset of organisations.

It would be unreasonable to expect ACSC and ASD to have deep technical knowledge of a wide variety of bespoke operational systems. The primary focus of their defensive efforts is government itself, and as ongoing audits report, that remains a tough ask.

So greater value may be found through collaborative, peer-to-peer sharing within sectors, given the industry-specific nature of operational systems. Care is needed, however: such arrangements may run aground on regulatory and competition concerns, particularly opportunities for collusion.

Further, what may work well for large, established companies may impose barriers to entry for newer, smaller players—or increase the prospect of predatory poaching and acquisition of small businesses.    

That lends greater weight to the not-for-profit, industry-based information sharing and analysis centres (ISACs) prevalent in the United States—see, for example, the ISACs serving the financial services, defence, information technology, health, automotive and space sectors.

ISACs provide an interface between government and companies, disentangling intelligence and surveillance from assistance and outreach. They focus on industry-specific operating environments, improving relevance; they help manage reputational risk and support all members, including small and medium-sized businesses.

Relevance is also shaped by the ability of the receiver to understand and act on the intelligence in a timely manner.  Actionability assumes commonalities of language, data interoperability, shared threat perception, systems support and human capability. 

Such conditions are more likely to hold for large organisations than for small and even medium-sized ones. Without them, sharing is slower, more reliant on manual approaches and more prone to miscommunication.

Complementary activities are needed to assist small and medium businesses, and those with bespoke and niche technologies (often of interest to acquisitive actors). The NSA report outlines numerous initiatives, including working with cloud providers, developing ‘consensus-based, industry-driven’ standards and offering its own no-cost cybersecurity services to selected small businesses.

Some thoughts on recommendations

The hubs-and-spokes model of information sharing exemplified by the ACSC’s CTIS can be described as ‘weakly successful’. It’s a reasonable start but, alone, suffers from systemic weaknesses that will limit its utility. Strengthening mechanisms and opportunities means more than offering technical solutions: social, political and economic drivers and consequences shape the management of cyber.

First, there is trust: government needs to do more to show it is trustworthy, particularly in terms of the data it gathers. 

That means untangling outreach and sharing from surveillance and the work of intelligence—in short, better governance, different structures and greater accountability. Alternatives abound, from a statutory agency to a separate ministry or an analogue of the Hybrid CoE, complete with its guarantees of independence and inviolability of data.

Second, there are the limits to the scope and scale of expertise. Government agencies do not hold—and cannot expect to hold—all the intelligence and knowhow needed to meet the challenges posed by cyber across every sector in the economy and society. 

Encouraging third-party not-for-profits to deepen the well of expertise relevant to industry sectors would help build expertise and resilience across the broader economy—the EU cyber agency offers an ISAC-in-a-box guide. That will need ongoing government co-investment to build critical mass and sustain adaptation—it’s not simply a matter of transferring cost and risk to industry, as has been past practice.

Last, there’s differentiation, diversity and depth. Industry is not homogenous. What works for one sector and for large organisations may not suit, and may even impair, others.

A better understanding of industry sectors, business operations and interdependencies would help government structure more targeted mechanisms and assistance. That’s the sort of skill more likely to be found in Treasury and the Industry Department than in ASD or Home Affairs.

In short, government needs to mature its approach. Intelligence sharing will be increasingly important. Government has to adapt to a dynamic, essentially infinite cyber environment while working with a changeable but lumpy economic landscape and with sceptical stakeholders. It will need a more structured, more differentiated and trustworthiness-oriented approach to succeed.