As the Daimler board member responsible for integrity and legal affairs, Renata Jungo-Bruengger is looking carefully at the implications of big data, across the enterprise and for customers, as well as of artificial intelligence.


Daimler’s Jungo-Bruengger says ethics and law tend to trail technology developments (Photo: Claus Dick)

Many IT projects raise key questions, for example, in the wide-ranging field of artificial intelligence, says Renata Jungo-Bruengger, the Daimler board member responsible for integrity and legal affairs. That’s why the attorney wants to promote legal clarity on complex IT issues. 

Jungo-Bruengger, who is Swiss, has a law degree from the University of Zurich. She joined Daimler’s legal department and has been the board member in charge of integrity and all legal issues at the premium car group since January 2016.

Ms. Jungo-Bruengger, how significant are the mutual reservations that the business and legal sides at Daimler have — with regard to good governance?

Renata Jungo-Bruengger: (laughs) That’s an interesting way of putting the question. You immediately get an image of dangerous attorneys, of being lectured, of the strict compliance manager. That’s not the case. I would like to make legal consulting business-oriented. And – if I may say so – it works. If you look at how my colleagues in this area work with the business side, you’ll see that everyone is pulling together. There are no mutual reservations.

Please give us an example…

Take autonomous driving. Attorneys and engineers are working very closely in this area, for example, on product safety issues or in compliance consulting. I find that employees are thankful when they are advised. And it is important to me that our coworkers within the company understand the strategies of the group, especially if people are making inroads into new business fields. There is a great need for consulting in this regard. The same applies to issues surrounding data privacy.

What is the situation with classic legal issues?

Of course, we also have training sessions on the classic legal areas. We use a wide variety of tools for this. We’ve just launched a new, web-based training program that is built in modular form. There are mandatory portions, but I can also look for the fields of activity that are important to me and assemble them appropriately for my purposes. The rationale behind this: We want to sensitize our employees to legal issues.

One of your stated goals is to turn integrity and values into a competitive advantage for the company. How is that supposed to work?

I am convinced in general that the company culture is an important and central issue in today’s regulatory environment, and we have to rethink these matters. Incidentally, this also applies to sustainability. The potential for competitive advantages can also be found there. The point is that I can’t just push a button and set the company’s integrity and values. There are ongoing processes behind them, and they have to be worked out. You have to trace a wide arc and define the common values under it. In the end, it is not always clear to employees what is right and wrong. Reality is not always black and white. We provide help with this. We ultimately define a standard that an employee can apply on the basis of a common understanding – at any time. One thing is clear: Today, a large company like Daimler cannot afford to do without compliance.

What would be an example of a value?

Mutual respect, openness, fairness and transparency are important. As you know, Daimler is working on a new culture called Leadership 2020. Among other things, these values flow into it.

Together with development chief Ola Kaellenius, you lead Daimler’s Corporate Sustainability Board. What does this panel manage and monitor?

The board consults on the company’s sustainability strategy. It decides how we communicate with our stakeholders on these topics. But none of this is new. Daimler has been carrying out a sustainability dialog for nearly 10 years, discussing issues that are critical to society. These include fields such as autonomous driving and environmental protection. We also maintain close contacts with NGOs and regularly exchange views with them. In turn, teams from the various departments operate below the board level, and the departments help us define values.

You are convinced that jurists should actively shape digitalization. How do you handle that?

We have a defined IT compliance system and specialized staff who are involved with the relevant issues within the context of IT projects, for example, in the wide-ranging field of cloud applications. You can imagine that there are a great many relevant legal questions to discuss in this area alone. And we are building up tremendous expertise within the company for consultations on these matters. But it is not easy, because specialists and attorneys who are somewhat knowledgeable about IT are in high demand at the moment.

Furthermore, we have to prepare for new issues. I’ll give you an example: Artificial intelligence is being thoroughly discussed in the legal community. Among other possibilities, will I be able to arrange for AI to check simple contractual work, general terms and conditions, in the future? Or will you be able to draft even more complex things with AI’s help? In fact, attorneys have to play an active role in shaping these matters and create clarity at the same time.

How many professionals work in your area?

We naturally have overlapping assignments in the compliance organization. But in my area, about 30 specialists currently handle IT, data privacy and data governance issues. On the IT side, about 10 to 15 of them focus specifically on these related sets of issues. We also draw on external support.

What IT issue currently poses the greatest legal challenge?

Without a doubt, a central issue is how to deal with big data. How do we categorize the massive amounts of data that will accumulate in the future? What tools do we use? Not only that: We are facing ever more stringent data privacy regulations. The data security aspect is extremely demanding – customers must be confident that their data are processed properly in accordance with data privacy rules and that data security is guaranteed. Our job is to structure the associated processing procedures clearly and make them transparent so that the customer knows what personal data are being processed and how he can give or withhold his consent. Trust is important.

Do you believe that customers trust the auto industry at the moment?

Trust has been lost, of course. We have to work on that. Data privacy, data transparency, data security and customer self-determination are – as I just mentioned – enormously important to us. We want to re-establish trust and maintain it.

Is data privacy too stringently regulated in Germany? Doesn’t it potentially stand in the way of new business models?

That question is often asked, and it is not that easy to answer. On one side, people want to know what happens to their data. At the same time, they naturally want the ability to use all these digital services freely and quickly, and that’s why they often agree unhesitatingly to the general terms and conditions on Google and Facebook and for numerous apps. We have to deal with very ambivalent behavior. At this point, it is not always easy to find the right solution. If you value data privacy, the processes become somewhat more cumbersome.

And in the end, it is also a cultural issue. In Britain, for example, there is vehicle insurance based on the pay-how-you-drive principle. In Germany, these concepts – which rely heavily on user data – don’t yet work. In this country, we deal with the topic differently. Here the right to data privacy is actively asserted – despite all the ambivalence – and you have to take this into account. The situation is different in other markets. But we are dealing with a process of change in Europe since the new EU data privacy regulations are very strict. In other words, we all have to change our thinking.

On the topic of guidelines: Autonomous driving in particular requires new legal frameworks. Where does the discussion stand right now?

We have made considerable progress in Germany, which is one of the first countries with a law on fully automated driving on the way. That’s great progress. It creates legal certainty for further developments. But there are certainly points that still need to be worked out – especially in data privacy. By way of background: We have come out in favor of a data storage device. But legislation still has to spell out how it should be handled. For us as a company, for example, it is important that the system be technically open and simultaneously have a specific purpose. As far as the much discussed liability issues are concerned, people are sticking with driver/owner and producer liability, that is, the classic legal model, which will function well in this area as well.

In the future, when machines take over tasks from people, there will also be questions of liability. But a machine is neither a legal nor a natural person in terms of the law. How do we deal with that?

That’s a good question. Exciting legal discussions are associated with it, and there are now numerous treatises on it by legal scholars. There are still no concrete solutions. But that’s only logical. To begin with, we have to discuss the ethical dimensions of these issues on a societal level.

You are addressing the ethical aspect: How hard is it for ethics and the law to keep pace with technological developments?

Technological developments are generally more rapid. It is actually not that easy to take action in step with them. In the case of autonomous driving, for example, things are working extremely well. We already have – as described earlier – a level of legal certainty in Germany and can adapt appropriately to the development. But at the same time, as a company, we must cultivate a dialog and discuss the ethical aspects with the public. Legal certainty regarding a technology is of no use if society has not accepted it.