6. CONCLUSIONS AND FURTHER WORK
Despite the caveats identified previously, the framework is a versatile platform that enables experiments to be designed according to user-defined protocols. One example is calculating the standard deviation of artefact counts across repeated executions of a malware sample, thereby providing an estimate of uncertainty. Indeed, Hubbard (2014, p. 162) points out that where there is a lot of uncertainty in a quantity, very little data is needed to reduce that uncertainty significantly. Hence, producing an estimate of the expected number of artefacts significantly reduces the uncertainty in what is expected from subsequent observations.
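The repeated-execution protocol described above can be sketched as follows. This is a minimal illustration only: the artefact counts are hypothetical placeholders, not data produced by the framework.

```python
# Minimal sketch: estimate the expected number of artefacts and its
# uncertainty from repeated executions of one malware sample.
import statistics

# Hypothetical artefact counts observed across five runs (placeholders)
artefact_counts = [42, 45, 41, 44, 43]

mean = statistics.mean(artefact_counts)
stdev = statistics.stdev(artefact_counts)  # sample standard deviation

# The mean gives the expected artefact count; the standard deviation
# quantifies the uncertainty in that expectation.
print(f"Expected artefacts: {mean:.1f} +/- {stdev:.1f}")
```

Even a handful of runs, as Hubbard's observation suggests, can narrow the plausible range for the artefact count considerably.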
Support is currently limited to tools that examine file, registry, and network artefacts, such as those in the Sysinternals suite (Microsoft, 2020). The tool under test must be capable of being initiated and, if needed, configured via the command line. To capture its output, the tool must also provide a means to export a log file in any text-based format. Future work could extend support to GUI-based tools.
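The two requirements above (command-line initiation and a text-based log export) could be exercised with a driver along the following lines. This is a sketch under stated assumptions: the tool name `monitor.exe`, its flags, and the log filename are hypothetical placeholders standing in for the actual tool under test.

```python
# Minimal sketch of driving a command-line tool under test and
# collecting its exported text-based log.
import subprocess
from pathlib import Path

def run_tool(tool_cmd, log_path):
    """Launch the tool under test, wait for it to exit, and return
    the lines of the log file it exported."""
    subprocess.run(tool_cmd, check=True)  # raises if the tool fails
    # Any text-based format (CSV, plain text, etc.) can be read this way
    return Path(log_path).read_text().splitlines()

# Hypothetical invocation (tool name, flags, and log file are placeholders):
# lines = run_tool(["monitor.exe", "/accepteula", "/log", "out.csv"], "out.csv")
```

A GUI-based tool offers no equivalent entry point, which is why extending support to such tools is left as future work.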
The framework is at an early stage of its life-cycle, so further work to critically review it and (where appropriate) adopt it elsewhere would contribute towards addressing the framework's requirements relating to critique and general acceptance.
Further work could also include the development of an offline oracle. This would provide greater control over parameters, such as execution times and the number of runs, to better define and hence establish ground truth. A community-validated oracle of reference data would provide the greatest level of confidence in the results of the MATEF and could be based on existing projects such as MAEC (Kirillov, Beck, Chase, & Martin, 2010) and YARA (‘YARA’, n.d.). Improving how ground truth is determined in this way also has the potential to strengthen how the framework's validation requirements are met.
There is also scope to harden the virtual environment to minimise detection and evasive behaviour by malware. Furthermore, future work could validate the framework by implementing it and testing it against empirical data. Finally, there is also room to engage with practitioners and other stakeholders to gather feedback on the identified requirements and design.
Table 4. Review of requirements.