Michaela Greiler


Main focus: Software engineering

Twitter handle: @mgreiler

Websites/blogs: https://www.michaelagreiler.com , https://www.linkedin.com/in/mgreiler/

Languages: Dutch, English, German

City: Munich, Villach, Seattle

Country: Austria

Topics: software development practices, software engineering, technology, data analysis, data science, female entrepreneurship, software quality, code review, data-driven decision making, software testing

Bio:

I specialize in software process improvement. For the past five years, I have been working at Microsoft as a software engineer and researcher. I have worked with several product teams at Microsoft, such as Office, Windows, Exchange, and Visual Studio, to optimize their software development practices. I use a mixture of qualitative and quantitative techniques, such as interviews, observations, and surveys, as well as data mining and data analysis, to make teams more efficient and productive when it comes to software testing and code reviewing.

I hold a PhD in Software Engineering, as well as a Master's and a Bachelor's degree in Computer Science.

Examples of previous talks / appearances:

Keynote about Code Reviewing at Microsoft

This invited keynote was given at the International Conference on the Quality of Information and Communications Technology.

Abstract of keynote: Four eyes see more than two. Following this well-known principle, code reviews are part of the backbone of Microsoft's quality culture.

Microsoft is not the only company betting on code reviews. Over the past decade, both open source and commercial software projects have adopted code review practices as a quality control mechanism.

Even though code reviews have many benefits, developers spend a large amount of time and effort performing them.

Therefore, at Microsoft, we constantly seek to improve our understanding of the practice of code reviewing. The aim is to improve the productivity of our engineering teams.

We do so by analyzing millions of code reviews that are produced by our engineers. We complement this data by observing, interviewing and surveying groups of software engineers that participate in code reviewing.

In this talk, I will give an overview of Microsoft's code review practices. I explain what we have learned about why and how code reviews are performed, and show which benefits, challenges, do's and don'ts come with them, as well as which open questions are still to be answered.

This talk is in: English
Keynote about Data-Driven Decision Making

Abstract: Tom DeMarco states that “You can’t control what you can’t measure”. But how much can we change and control (with) what we measure? In this keynote, I investigate the opportunities and limits of data-driven software engineering.

I show which opportunities lie ahead of us when we engage in mining and analyzing software engineering process data.

I also highlight important factors that influence the success and adaptability of data-based improvement approaches.

In this keynote about data-driven decision making, I show that understanding how the data is collected is crucial: it determines the quality of the collected data, which in turn influences the quality and validity of the analysis and its outcome.

This talk is in: English
Technical talks about Code reviewing

I analyzed and improved code reviewing tools and practices for several years while working in the Tools for Software Engineering team at Microsoft and at Microsoft Research.

I have given several technical talks at various international conferences, as well as at company events at Microsoft in Redmond, US, and Microsoft Research in the UK. The audience was technical, and its size varied between 20 and 300 people.

For a more complete list of the talks, please visit my website.

This talk is in: English
Test Confessions: What Eclipsers Think and Do About Testing

This 40-minute talk was given to a technical audience of approximately 500 people at the biggest conference for Eclipse-based plug-in systems.

In the talk, I presented an in-depth study of the testing culture within the Eclipse community, following the Grounded Theory approach widely used in the social sciences.

I highlighted what developers think and what they do when it comes to testing, the challenges they face, and how they address them. The findings include (often opposing) views on test automation, versioning, GUI testing, bug fixing, tooling, reviewing, and the role of the Eclipse community in quality assurance.

This talk is in: English
Technical talks about Software testing

I gave around 25 technical talks at various international conferences on topics related to software testing.
About 90% of the time, the audience was an expert audience of researchers or software engineers. The audience size varied between 20 and 450 people.

This talk is in: English