\usepackage[style=numeric-comp,backend=bibtex]{biblatex}
\usepackage{amsthm}
\usepackage{todonotes}
\usepackage{xspace}
\usepackage{he-she}
\usepackage{verbatim}
\usepackage{minted}
\usemintedstyle{bw}
\newcommand{\var}[1]{\type{#1}}
\newcommand{\refactoring}[1]{\emph{#1}}
\newcommand{\ExtractMethod}{\refactoring{Extract Method}\xspace}
\newcommand{\MoveMethod}{\refactoring{Move Method}\xspace}
\newcommand{\citing}[1]{~\cite{#1}}
\newcommand\todoin[2][]{\todo[inline, caption={2do}, #1]{
\begin{minipage}{\textwidth-4pt}#2\end{minipage}}}
\title{Refactoring}
\subtitle{An unfinished essay}
\author{Erlend Kristiansen}
In the extreme case one could argue that such a thing as \emph{software
obfuscation} is a form of refactoring. If we were to define it as a
refactoring, it could be defined as a composite refactoring
\see{compositeRefactorings}, consisting of, for instance, a series of rename
refactorings. (It could of course be much more complex, and its mechanics
would not exactly be carved in stone.) To perform some serious obfuscation, one
would also take advantage of techniques not found among established
refactorings, such as removing whitespace. For languages not sensitive to
whitespace, this might not even generate a different syntax tree, placing such
transformations in the gray area of what kinds of transformations are to be
considered refactorings.
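As a hedged sketch of this idea (the class and method names are invented for
illustration), a series of \refactoring{Rename} refactorings preserves behavior
while destroying readability:

\begin{minted}{java}
// Before: descriptive names communicate intent.
public class PriceCalculator {
    public double totalWithTax(double price, double taxRate) {
        return price * (1 + taxRate);
    }
}

// After three Rename refactorings: the observable behavior is
// unchanged, but the code no longer says anything about its purpose.
public class A {
    public double m(double a, double b) {
        return a * (1 + b);
    }
}
\end{minted}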
Finally, to \emph{refactor} is (quoting Martin Fowler)
\begin{quote}
The word \emph{simple} came up in the last section. In fact, most primitive
refactorings are simple. Their true power is first revealed when they are
combined into larger, higher-level refactorings, called \emph{composite
refactorings} \see{compositeRefactorings}. Often the goal of such a series of
refactorings is a design pattern. Thus the \emph{design} can be evolved
throughout the lifetime of a program, as opposed to being designed up-front.
It is all about being structured and taking small steps to improve a program's
design.
Another result from the Miller article is that when the amount of information a
human must interpret increases, the translation from one code to another must
be almost automatic for the subject to be able to remember the translation
before \heshe is presented with new information to recode. Thus learning and
understanding how to best organize certain kinds of data is essential for
handling that kind of data efficiently in the future. This is much like when
children learn to read. First they must learn how to recognize
\url{http://visualvm.java.net/}} the software and having isolated the actual
problem areas.
\section{Composite refactorings}\label{compositeRefactorings}
\todo{motivation, examples, manual vs automated?, what about refactoring in a
very large code base?}
Generally, when thinking about refactoring, at the mechanical level, there are
A composite refactoring is more complex, and can be defined like this:
\definition{A composite refactoring is a refactoring that can be expressed in
terms of two or more other refactorings.}
\noindent An example of a composite refactoring is the \refactoring{Extract
Superclass} refactoring\citing{refactoring}. In its simplest form, it is composed
\end{figure}
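As a hedged sketch of the composition (class names invented for illustration),
\refactoring{Extract Superclass} can be seen as creating an empty superclass
and then applying refactorings such as \refactoring{Pull Up Field} and
\refactoring{Pull Up Method} to the duplicated members:

\begin{minted}{java}
// Before: two classes duplicate a field and its accessor.
class Employee {
    private String name;
    String getName() { return name; }
}
class Department {
    private String name;
    String getName() { return name; }
}

// After: the common members are pulled up into a new superclass.
abstract class Party {
    private String name;
    String getName() { return name; }
}
class Employee extends Party {}
class Department extends Party {}
\end{minted}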
\section{Manual vs. automated refactorings}
Refactoring is something every programmer does, even if \heshe does not know
the term \emph{refactoring}. Every refinement of source code that does not alter
the program's behavior is a refactoring. For small refactorings, such as
\ExtractMethod, executing it manually is a manageable task, but it is still
prone to errors. Getting it right the first time is not easy, considering the
signature and all the other aspects of the refactoring that have to be in
place.
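To make concrete what has to be in place, here is a hedged sketch of a manual
\ExtractMethod (method names invented for illustration); the extracted
fragment's name, parameters, and return type must all be chosen correctly by
the programmer:

\begin{minted}{java}
// Before: the summing loop is a candidate for extraction.
double invoiceTotal(double[] amounts, double taxRate) {
    double sum = 0;
    for (double a : amounts) {
        sum += a;
    }
    return sum * (1 + taxRate);
}

// After: the local variable becomes the return value, and the
// array becomes a parameter of the new method's signature.
double invoiceTotal(double[] amounts, double taxRate) {
    return sum(amounts) * (1 + taxRate);
}

double sum(double[] amounts) {
    double sum = 0;
    for (double a : amounts) {
        sum += a;
    }
    return sum;
}
\end{minted}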
Take for instance the renaming of classes, methods and fields. For complex
it is too complex to be easily manipulated.

\chapter{Related Work}

\section{The compositional paradigm of refactoring}
This paradigm builds upon the observation of Vakilian et
al.\citing{vakilian2012} that, of the many automated refactorings available in
modern IDEs, the simplest ones dominate the usage statistics. The study mainly
focuses on \emph{Eclipse} as the tool under investigation.

The paradigm is described almost as the opposite of automated composition of
refactorings \see{compositeRefactorings}. It works by providing the programmer
with easily accessible primitive refactorings. These refactorings are to be
accessed via keyboard shortcuts or quick-assist menus\footnote{Think
quick-assist with Ctrl+1 in Eclipse.} and promptly executed, as opposed to the
currently dominant wizard-based refactoring paradigm. They are meant to
stimulate composing smaller refactorings into more complex changes, rather
than doing a large up-front configuration of a wizard-based refactoring before
previewing and executing it. The compositional paradigm of refactoring is
supposed to give control back to the programmer, by supporting \himher with
the option of performing small, rapid changes instead of large changes with a
lesser degree of control. The authors hope this will lead to fewer
unsuccessful refactorings. It could also lower the bar for understanding the
steps of a larger composite refactoring, and thus help in figuring out what
goes wrong if one should choose to opt for a wizard-based refactoring.

Vakilian and his associates have performed a survey of the effectiveness of
the compositional paradigm versus the wizard-based one. They claim to have
found evidence that the \emph{compositional paradigm} outperforms the
\emph{wizard-based} one. It does so by reducing automation, which seems
counterintuitive. Therefore they ask the question ``What is an appropriate
level of automation?'', thus questioning what they perceive as a rush toward
more automation in the software engineering community.

\backmatter{}
\printbibliography
\listoftodos