- Sophie Kaleba
I am part of the PLAS research group and I am currently investigating phase-driven dynamic optimisations. I am advised by Stefan Marr and Richard Jones.
At run time, the behaviour of a program is bound to evolve: for example, it may first collect data, then compute statistics, and finally generate the related output. Similar behaviours can be observed repeatedly over the program's lifetime: they can be grouped into phases, which may recur several times at run time.
Phase detection is a dynamic analysis technique that aims to detect the different phases a given program goes through. This analysis faces two main challenges. The first is to determine what a phase is, i.e. to identify the behaviour(s) being monitored and the relevant metrics that characterise them. The second is to determine when the program undergoes a phase change, i.e. moves from one phase to another.
Phase detection can be performed online, offline, or both, and can be software-based, hardware-based, or a combination of the two: for instance, some phase detection techniques monitor instruction working sets, in which case phases are determined by the number and types of instructions executed; other techniques monitor the execution frequency of specific code sections of the application, or conditional branch counts.
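To make this concrete, here is a minimal sketch in Python of an interval-based, software-only detector. The names (PhaseDetector, signature, distance), the Manhattan distance and the 0.5 threshold are illustrative assumptions, not taken from any particular system: each profiling interval is summarised as a normalised execution-frequency vector over code sections, and a phase change is declared when the distance to the current phase's signature exceeds the threshold.

    # Illustrative sketch of interval-based phase detection; not a specific system.
    # A "signature" summarises one profiling interval as the relative execution
    # frequency of each code section (e.g. a basic block or method identifier).
    from collections import Counter

    THRESHOLD = 0.5  # assumed sensitivity: largest distance tolerated within one phase

    def signature(section_counts: Counter) -> dict:
        """Normalise raw execution counts into relative frequencies."""
        total = sum(section_counts.values()) or 1
        return {sec: n / total for sec, n in section_counts.items()}

    def distance(sig_a: dict, sig_b: dict) -> float:
        """Manhattan distance between signatures (0 = identical, 2 = disjoint)."""
        keys = set(sig_a) | set(sig_b)
        return sum(abs(sig_a.get(k, 0.0) - sig_b.get(k, 0.0)) for k in keys)

    class PhaseDetector:
        def __init__(self):
            self.current_sig = None   # signature of the phase we are currently in
            self.phase_id = 0

        def observe_interval(self, section_counts: Counter) -> int:
            """Feed the counts of one interval; return the current phase id."""
            sig = signature(section_counts)
            if self.current_sig is None or distance(sig, self.current_sig) > THRESHOLD:
                self.phase_id += 1        # behaviour changed enough: new phase
                self.current_sig = sig
            return self.phase_id

    # Example: a data-collection interval, a similar one, then a statistics-heavy one.
    detector = PhaseDetector()
    detector.observe_interval(Counter({"read_input": 80, "alloc": 20}))      # -> 1
    detector.observe_interval(Counter({"read_input": 75, "alloc": 25}))      # -> 1 (same phase)
    detector.observe_interval(Counter({"compute_stats": 90, "alloc": 10}))   # -> 2 (phase change)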
Dynamic runtimes aim for performance, especially in terms of execution time. High run-time performance is usually achieved through intricate dynamic optimisations. Furthermore, these runtimes are likely to run in concurrent environments or on heterogeneous hardware.
Most dynamic optimisations, however, incur an overhead, because they both profile and optimise at run time. To address this issue, phases have been used as indicators to decide whether a specific optimisation should be triggered, and phase detection techniques have been used to guide compiler optimisations and, more generally, program optimisations. For instance, phase detection has been applied to the optimisation of temperature control, or to just-in-time compilation, with phases identified from hardware counters or method lifetimes.
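As a hypothetical illustration of such a trigger, building on the PhaseDetector sketch above (the STABLE_INTERVALS constant and the compile_hot_methods callback are assumptions, not a real runtime's API), a phase-aware runtime could delay an expensive optimisation until the detected phase has persisted for a few intervals, so that compilation effort is not wasted on transient behaviour:

    # Sketch of a phase-driven optimisation trigger (e.g. for JIT recompilation).
    # Assumes the PhaseDetector from the previous sketch; all names are illustrative.
    STABLE_INTERVALS = 3   # assumed: a phase must repeat this often before we optimise

    class PhaseAwareOptimiser:
        def __init__(self, detector, compile_hot_methods):
            self.detector = detector
            self.compile_hot_methods = compile_hot_methods  # callback into the runtime
            self.last_phase = None
            self.stable_for = 0
            self.optimised_phases = set()

        def on_interval(self, section_counts):
            phase = self.detector.observe_interval(section_counts)
            self.stable_for = self.stable_for + 1 if phase == self.last_phase else 0
            self.last_phase = phase
            # Only pay the optimisation cost for phases that look long-lived
            # and that have not already been optimised.
            if self.stable_for >= STABLE_INTERVALS and phase not in self.optimised_phases:
                self.compile_hot_methods(section_counts)
                self.optimised_phases.add(phase)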
These works showed promising results in terms of performance. Nevertheless, most of them were conducted more than 10 years ago. Since then, new dynamic optimisations have been developed, and established ones have grown more complex, especially for dynamic languages such as JavaScript, where JIT compilers are widely used. As a consequence, applications have become richer: their behaviour is now more complex, and they rely on many more abstraction layers, which need to be compiled away for performance. Given this complexity, it is likely that phase detection and phase-driven optimisations can still have a strong impact on run-time performance.
The goal of my PhD is to investigate how today's systems can benefit from phase-related information, with a focus on performance.
I am mostly interested in dynamic languages, their behaviour and runtimes, and how to optimise them.
I first got involved in this field during my Google Summer of Code project, where I worked on extending the virtual machine profiler used in Squeak and Pharo. I also got a glimpse of the dynamism of R during my MSc internship in the Parkas research group, where I analysed R applications and, notably, their use of dynamic features. My MSc thesis is available here.
I have taught:
· CO658: Programming Language Implementation
· CO659: Computational Creativity