
SECyber's Response to the DCMS Online Harms Consultation


This government has committed to annual transparency reporting. Beyond the measures set out in this White Paper, should the government do more to build a culture of transparency, trust and accountability across industry and, if so, what?

There is a tendency across HMG initiatives to lower the barrier for new start-ups and SMEs. Small and micro organisations today are able to create and/or access low-cost platforms and storage with significant reach with just one or two founders.


These smaller entities are also frequently part of the supply chain and therefore introduce additional risks for the end client, their suppliers and partners.


This lenient approach is outdated - particularly for the tech sector DCMS is targeting - and the SMEs and micros who commit to being compliant with existing legislation and regulation are being disadvantaged.


We have regularly been contacted in confidence about these initiatives being anti-competitive.


This issue is particularly important today as many start-ups - including those claiming to be cyber security specialists - are visibly not compliant with the legal requirements for their own companies, thus undermining trust and accountability in the market.


Innovative solutions do not need to be insecure, and by insecure we do not mean conformance to a Code of Conduct, we mean technical security testing - and by technical security testing we are not referring to annual pen testing and quarterly vulnerability scans!


Should designated bodies be able to bring ‘super complaints’ to the regulator in specific and clearly evidenced circumstances?

We don’t have enough information about what a ‘super complaint’ looks like to provide an informed opinion.


Question 2a: If your answer to question 2 is ‘yes’, in what circumstances should this happen?

Not answered


What, if any, other measures should the government consider for users who wish to raise concerns about specific pieces of harmful content or activity, and/or breaches of the duty of care?

Tech companies already have to comply with current legislation and regulation as well as sector-specific contractual clauses, best practice and various Codes of Conduct. As experienced practitioners in technical security, risk assessment, the legal profession and law enforcement, we strongly recommend you review existing legislation, regulation, standards, best practice, good practice and Codes of Conduct to avoid confusion and duplication.


We also strongly recommend you collaborate with the Crown Prosecution Service and police to understand why clear breaches of existing legislation and regulation relating to serious online harms, including but not limited to death threats, rape threats and CSE-related content, appear to be ignored and do not result in effective outcomes and justice.


As parents and victims, we are baffled: prior to online social networks it was a case of our word against theirs, yet now that we have clear audit trails with screenshots for evidence, we receive little if any assistance from the authorities when we need it the most. We wonder how an additional regulator will help with these two key areas of concern.


What role should Parliament play in scrutinising the work of the regulator, including the development of codes of practice?

Not answered


Are proposals for the online platforms and services in scope of the regulatory framework a suitable basis for an effective and proportionate approach?

Whilst we appreciate the intent behind reducing the burden on smaller companies (see our earlier comment), tech companies claiming to provide and operate secure systems, or even just to comply with minimum standards, should already be able to cooperate with law enforcement investigations.


If organisations were compliant with the mandated legislation, regulation, standards and contractual clauses they claim to be (and have to be), admissible evidence would be accessible to prosecute offenders.


Unfortunately, many ‘secure’ systems we see are not secure and therefore unable to provide admissible evidence even if they collaborated with law enforcement. In the case of CSE, this is also exploited as offenders are able to claim the evidence gathered from their insecure network/system cannot be attributed to the suspect/s.


We recommend this ongoing non-compliance issue be tackled first. 


In developing a definition for private communications, what criteria should be considered?

Not answered


Which channels or forums that can be considered private should be in scope of the regulatory framework?

If the online activity is illegal, it’s illegal… see our previous comments. How it’s tackled for a private channel/forum needs to be in scope regardless of how difficult a task it may seem.


What further steps could be taken to ensure the regulator will act in a targeted and proportionate manner?

We are confused by this section's risk-based approach and how it will be achieved in reality.


To enable the regulator to ‘tackle harms that have the greatest impact on individuals or wider society’, you will depend on evidence submitted by people who are aware of this initiative.


These case studies/evidence will then need to be assessed and verified which takes time and expertise.


Online harms that have a significant impact on one group of individuals may not be of concern at all to another group.


There is discussion of focussing where there is actual evidence of harm, yet surely where there is evidence of harm there is already evidence of breaches of legislation and regulation? We are also concerned by the references to GDPR and the ICO. This proposed approach is not comparable to GDPR.


What, if any, advice or support could the regulator provide to businesses, particularly start-ups and SMEs, to comply with the regulatory framework?

With reference to our previous comments, small and micro organisations today are able to create and/or access low-cost platforms and storage with significant reach with just one or two founders.


These smaller entities are also frequently part of the supply chain and therefore introduce additional risks for the end client, their suppliers and partners.


We understand the reasoning behind lower bars during innovative initiatives; however, when organisations are live, dangerous and interacting with real people, they should not be exempt from compliance with legislation and regulation.


This is particularly important when the potential impact is e.g. rape, murder or terrorist activity.


Should an online harms regulator be: (i) a new public body, or (ii) an existing public body?

We are not yet clear what the benefits of the proposed role would be. More information is required.

Question 10a: If your answer to question 10 is (ii), which body or bodies should it be?

Not answered


A new or existing regulator is intended to be cost neutral: on what basis should any funding contributions from industry be determined?

Not answered


Should the regulator be empowered to i) disrupt business activities, or ii) undertake ISP blocking, or iii) implement a regime for senior management liability? What, if any, further powers should be available to the regulator?

Some organisations rely on these platforms for their marketing and engagement activities so disrupting the platforms’ activities would be unfair on those unsuspecting organisations.


We are unclear as to what ‘implement a regime for senior management liability’ means, but it sounds like an insurance package, which would be unacceptable as it implies purchased immunity from prosecution.


The Board is already jointly accountable and responsible – much higher than the ‘senior’ level. In smaller organisations, the accountable person will be easier to identify and address.    


Should the regulator have the power to require a company based outside the UK and EEA to appoint a nominated representative in the UK or EEA in certain circumstances?

Not enough information to provide a response.


In addition to judicial review should there be a statutory mechanism for companies to appeal against a decision of the regulator, as exists in relation to Ofcom under sections 192-196 of the Communications Act 2003?

Question 14a: If your answer to question 14 is ‘yes’, in what circumstances should companies be able to use this statutory mechanism?

Question 14b: If your answer to question 14 is ‘yes’, should the appeal be decided on the basis of the principles that would be applied on an application for judicial review or on the merits of the case?


Unanswered as it refers to, and follows on from, the previous question.


What are the greatest opportunities and barriers for (i) innovation and (ii) adoption of safety technologies by UK organisations, and what role should government play in addressing these?

Government must acknowledge that cyber-related organisations including start-ups and SMEs are specialist security companies – and that means they must comply with existing legislation and regulation.


Encouragement of fast growth of companies that claim to provide secure services and products to other organisations, but which fail to operate securely and compliantly themselves is both hypocritical and dangerous.


Furthermore, during recent years these non-compliant companies have been given an unfair competitive advantage, putting pressure on existing compliant companies, some of which have now ceased trading. We strongly recommend this ongoing issue be reviewed.


What, if any, are the most significant areas in which organisations need practical guidance to build products that are safe by design?

In a word - consequences. Operating outside of legislation and regulation is common practice in all sectors. It is not difficult to build safety and security into a product, but it is difficult to add it on afterwards.


Should the government be doing more to help people manage their own and their children’s online safety and, if so, what?

There is a significant amount of information available online. However, today’s children, having grown up with technology, understand how to configure their devices better than their parents and teachers do.


This is a generational issue and we strongly recommend DCMS considers which sectors really are vulnerable, now and in the near future.


The parents and teachers who are not comfortable configuring devices today will be the elderly generation forced online in the near future, who, unlike their children, weren’t brought up with tech.


What, if any, role should the regulator have in relation to education and awareness activity?

Not answered
