“Your RPA | Intelligent Automation project will almost certainly fail if you don’t mitigate against these risks right now” – Part 5 of 6

If organisations don’t want their RPA and IA program to fail, they need to plan for, and mitigate against, risk. Part 1 looked at strategic and implementation partner risks. Part 2 outlined tooling selection, testing and stakeholder buy-in risks. Part 3 called out people, cultural and financial risks. Part 4 examined expectation and execution risks. Today we look at scale and security risks.

Scale Risk:

1. Underutilised software robots, or not enough robots – review bot utilisation reports before hiring or buying more capacity, and be sceptical of headline bot-to-process ratios.

2. Created automation islands – without a Centre of Excellence, islands of uncoordinated and unconnected automation can proliferate. This can lead to unnecessary duplication of effort, duplicated environments, underutilised bots and inconsistent coding standards.

“Be careful if the process is automating any data covered by GDPR (or other standards and regulations such as ISO 27001, SOX, PCI, HIPAA, etc.) and make sure your technology and your bots can be compliant. Even an RPA recorder which takes a screenshot of the screen as you build your bot can be a breach, so ensure you either have specific fake records to develop against (so the screenshot is OK) or turn on the secure recorder feature if your platform has it.

Be very careful with logs and error screen snapshots. It is so useful to log everything to aid testing and diagnosis of issues, but it needs to be compliant. If the compliance and InfoSec departments find out there is PII data sitting in a log or screenshot somewhere it usually isn’t pretty, and it can kill a project. It is much better to seek advice from compliance and InfoSec before you build, and to define best practices and policies on logging, data storage and Bot Runner storage access controls early.

You tend to find they are so pleased to be consulted early on and are very helpful as a consequence. Bottom line, if you are just starting out, build compliance and Infosec into every aspect of all projects from the get-go.”

Simon Frank, RPA Lead
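To make Simon’s point about log hygiene concrete, here is a minimal Python sketch of one way to keep PII out of bot logs: a logging filter that masks known patterns before anything reaches a handler or a file. The patterns, logger name and example message are illustrative assumptions rather than features of any particular RPA platform; most platforms also offer their own secure-logging settings, which should be preferred where available.

```python
import logging
import re

# Illustrative patterns only – extend them to match the PII your processes actually handle.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

class PIIRedactingFilter(logging.Filter):
    """Masks known PII patterns before a log record reaches any handler."""
    def filter(self, record: logging.LogRecord) -> bool:
        message = record.getMessage()
        for name, pattern in PII_PATTERNS.items():
            message = pattern.sub(f"<{name} redacted>", message)
        record.msg, record.args = message, None
        return True

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("bot_runner")
logger.addFilter(PIIRedactingFilter())

logger.info("Processed claim for jane.doe@example.com, card 4111 1111 1111 1111")
# -> INFO:bot_runner:Processed claim for <email redacted>, card <card_number redacted>
```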

3. Not treating automation as a critical platform.

4. Considered only RPA or a limited number of digital or emerging technologies.

5. No RTO/RPO plans – when a robot stops, everyone has forgotten how the process works.

6. Momentum – completed a proof of concept but did not have the next set of processes to automate lined up and ready to go, resulting in the program stalling and people losing interest.

7. Seeing digital workers as a technology rather than a capability and capacity expansion device.

8. HR continues to work as before and keeps hiring people without checking whether digital workers and AI could complete the task instead.

9. Built the program on a desktop and did not treat the RPA platform as a business-critical platform.

10. Focused on automating tasks not end to end processes.

11. Understand your infrastructure – is it virtual, local, or a hybrid of both?

12. Did you build your infrastructure and Bots to mimic your production environment?

13. Are your servers that will host the Bots high-density (HD) or not? How will you scale if you purchase or spin up more Bots?

14. Most RPA platforms have updates twice a year or more. How do you plan to test those and update your internal platform?

15. How are you managing your RPA pipeline? Is there a ticketing system, a hub, or a planned path teams take to get in front of your COE team?

16. Have you thought about Bot maintenance – hard drive, memory usage, video or space constraints? (A minimal health-check sketch follows below.)
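On point 16, a lightweight health check can flag Bot Runner hosts that are running out of headroom before bots start failing mid-process. The sketch below is illustrative only: it assumes the third-party psutil package is installed on the host, and the thresholds are placeholders to be tuned to your own capacity plan.

```python
import shutil
import psutil  # third-party: pip install psutil

# Placeholder thresholds – tune to your own Bot Runner capacity plan.
CPU_LIMIT_PCT = 85
MEMORY_LIMIT_PCT = 80
MIN_FREE_DISK_GB = 10

def bot_host_health() -> list[str]:
    """Return warnings when the Bot Runner host is under resource pressure."""
    warnings = []
    cpu = psutil.cpu_percent(interval=1)            # sampled over one second
    mem = psutil.virtual_memory().percent
    free_gb = shutil.disk_usage("/").free / 1024**3
    if cpu > CPU_LIMIT_PCT:
        warnings.append(f"CPU at {cpu:.0f}% (limit {CPU_LIMIT_PCT}%)")
    if mem > MEMORY_LIMIT_PCT:
        warnings.append(f"Memory at {mem:.0f}% (limit {MEMORY_LIMIT_PCT}%)")
    if free_gb < MIN_FREE_DISK_GB:
        warnings.append(f"Only {free_gb:.1f} GB free disk (minimum {MIN_FREE_DISK_GB} GB)")
    return warnings

if __name__ == "__main__":
    for warning in bot_host_health():
        print("WARNING:", warning)
```

Run something like this on a schedule and feed the warnings into whatever monitoring your COE already uses, so Bot maintenance becomes routine rather than reactive.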

“Future proof your operating model, don’t just redesign your operating model for today.” 

Janine Gill, Director Intelligent Automation

Security Risk:

A number of changes to the IT environment need to be made to enable bots to work. For example, Windows Group Policy changes are required to avoid things that cause bots to break (e.g. stopping screen locking, pop-up messages, automatic reboots, and automatic log-offs after a period of inactivity). But just because the business introduces digital workers does not mean that all the very best security practices should disappear. Organisations need to adopt a ‘Zero Trust’ principle when it comes to bots and bot code execution.

1. Cloud introduces flexibility and agility but can bring additional data protection risks. Organisations need to make use of cloud vendors’ high availability, disaster recovery and security features as they roll out cloud solutions.

2. Hard-coding credentials into bot processes. Answer: use credential vaults so that anyone looking at RPA scripts cannot read the credentials, and use Single Sign-On so that Active Directory security permissions are passed on to the bots (a minimal vault-lookup sketch follows this list).

3. Not segregating duties and allowing bots to complete end to end processes where people are not permitted to do the same. Answer: internal audit controls should be introduced for bots as part of IT audit, and new policies and procedures should be defined by the compliance team. There should be no super-user bots; bots should mirror SME access and segregation-of-duties access rights.
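As a concrete illustration of points 2 and 3, here is a minimal sketch of a bot pulling its login from a central credential vault instead of hard-coding it. It assumes a HashiCorp Vault KV store and the hvac Python client purely for illustration; most RPA platforms ship an equivalent built-in credential vault, and CyberArk (mentioned in the quote below) is another common choice. The vault URL, secret path and field names are made up.

```python
import os
import hvac  # third-party HashiCorp Vault client: pip install hvac

# Illustrative values – the address, path and field names are not real.
VAULT_ADDR = "https://vault.example.internal"
SECRET_PATH = "rpa/bots/invoice-bot-01"

def fetch_bot_credentials() -> tuple[str, str]:
    """Pull the bot's login from the vault at runtime instead of embedding it in the script."""
    # The bot's own vault token is expected in the environment, never in source control.
    client = hvac.Client(url=VAULT_ADDR, token=os.environ["VAULT_TOKEN"])
    secret = client.secrets.kv.v2.read_secret_version(path=SECRET_PATH)
    data = secret["data"]["data"]
    return data["username"], data["password"]

username, password = fetch_bot_credentials()
# Hand these straight to the target application's login step; never log them
# and never commit them to the bot's source repository.
```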

“Today’s understanding of cyber security in the context of Intelligent Automation is often misconstrued. Cyber security is not just your “brick and mortar” password security. It transcends data access, user/robot access, encryption (data at rest, data in transit), passwords, log access and so on. To win in the digital world, organisations need to be vigilant and security aware to avoid data breaches or reputational damage. Security is a team sport – employees, the CISO and IA vendors all have a part to play.

  • Are you performing your due diligence before any IA vendor lock-in?

Probe your IA vendor and mitigate the future security risks:

  • Is your IA vendor security certified and compliant – GDPR, SOC 2, HIPAA?
  • Can your IA vendor support existing Active Directory Integration or Cyberark password management or similar technology?
  • Does your IA vendor support multi-factor authentication?
  • Does your IA vendor use the right protocols – TLS 1.2?
  • Is your route to the cloud secured?”

Tolani Jaiye-Tikolo, Senior Developer and Writer

  1. RPA application, program and security logs should be reviewed regularly. Use an RPA log immutability solution to safeguard the logs – or even anchor them with blockchain technology – to ensure they are not being tampered with (a minimal hash-chain sketch follows this list). Logs should not contain sensitive data such as credit card numbers or personally identifiable data.
  2. Credentials must be kept secure. Organisations should use credential vaults for managing passwords, and multi-factor authentication when bots attempt to log in to IT resources. Assign a unique identity to each RPA bot. Bot security and access rights should be documented and reviewed frequently. Bots should have owners, and those owners should be made responsible and accountable for their bots’ network and appliance access compliance.
  3. Detailed knowledge of the applications being automated should be required for a developer to predict possible security issues, though InfoSec are better placed to help test for application and hardware security vulnerabilities.
  4. Avoid granting your robot write access to databases; aim for read-only access. Databases themselves should be encrypted both at rest and in transit. Database logs should be reviewed frequently to check access rights and actions, and to ensure those actions are approved and secure.
  5. Integration with third-party applications, service providers or cloud services should be approached with caution. Data security and data privacy cannot be guaranteed once data leaves your organisation.
  6. Code review is very important in the development stage to ensure that design and implementation follow best practices. Closing processes out cleanly and completely once they have executed is key. Developers must ensure no floating data remains once a process has executed, so that the data cannot be read at a later date.
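To illustrate the log-immutability idea in point 1, here is a minimal hash-chain sketch: each entry records the hash of the previous entry, so any retrospective edit or deletion breaks the chain and is detectable on verification. The file name and fields are illustrative assumptions; a production deployment would more likely use a dedicated immutable-logging or WORM storage service, or the blockchain-backed approach mentioned above.

```python
import hashlib
import json
import time

LOG_FILE = "bot_audit_log.jsonl"  # illustrative file name

def _last_hash() -> str:
    """Return the hash of the most recent entry, or a fixed seed for an empty log."""
    try:
        with open(LOG_FILE, "r", encoding="utf-8") as f:
            lines = f.read().splitlines()
        return json.loads(lines[-1])["entry_hash"] if lines else "GENESIS"
    except FileNotFoundError:
        return "GENESIS"

def append_log_entry(bot_id: str, event: str) -> None:
    """Append an event whose hash chains to the previous entry, making tampering evident."""
    entry = {
        "timestamp": time.time(),
        "bot_id": bot_id,
        "event": event,                  # must already be free of PII – see the earlier sketch
        "previous_hash": _last_hash(),
    }
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    with open(LOG_FILE, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

append_log_entry("invoice-bot-01", "Process started")
append_log_entry("invoice-bot-01", "3 invoices posted to ERP")
# Re-computing the chain later reveals any edited or deleted entry.
```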

There are inherent risks in any sizeable technology, transformation or business program. Organisations need to accept and plan for risks so that they can avoid or mitigate them.

Let me know your thoughts in the comments below 👇

My Expertise: I’m an intelligent automation, data analytics, robotic process automation, and digital transformation expert. For the past 25+ years I have been driving business transformation across a range of industries using common sense, digital technologies, intelligent automation, data analytics, artificial intelligence, and robotic process automation. This has generated millions of dollars of value. I solve complicated problems others can’t, and I am happy to advise you and help solve your unique business challenges.

If you enjoyed this article then you may enjoy these 11 articles too.

  1. RPA and IA won’t sell itself; you have to do it – Author – 4 part series
  2. Building an Automation Centre of Expertise | An Experts Guide – 5 part series
  3. If your RPA program is not making money then it has failed.
  4. RPA – Proof of Concept (POC) or Proof of Value (POV)? Who cares, just get going!
  5. 40 Essential Selection Criteria to Choose an RPA Platform – 5 part series
  6. I meet 150+ developers and these are 20 signs of a truly gifted developer
  7. The A-Z of Robotic Process Automation, Intelligent Automation and Digital Transformation
  8. How to scale successfully – you have 60 seconds to reply
  9. Can organizations implement RPA without having a digital transformation strategy – what would you have said?
  10. FREE training sites for Robotic Process Automation, Intelligent Automation, Data Analytics, Artificial Intelligence & Digital Training Sites
  11. 22 ways to cut the cost of an automation program – 4 part series

If this could benefit someone else tag them and share this.

Free to reuse: We are a community of RPA, analytics, digital and intelligent automation experts with years of real world experience. We have stories to tell and the scars to show for it. We share our collective wisdom for free to simply provide as much value as we can to you. Therefore, if you want to post this article on your LinkedIn page then please feel free to do so. The more information we share within the RPA community the more likely businesses are to succeed with this excellent technology.

Further Help: If I can help you in any way please do reach out.

Note: The views expressed above are our own and not those of my employer or the employers of the contributing experts.
