We introduce the Relevance Assessment Tool (RAT), a Web-based application that allows researchers to design complex search engine retrieval effectiveness studies. The tool consists of modules that researchers can use to design tests, to collect results from different search engines through screen scraping, and to gather relevance judgments from a large number of jurors through crowdsourcing. Because the software is Web-based, relevance judgments can be collected in a distributed fashion, which enables much larger search engine studies than previously conducted. To our knowledge, no comparable tool has been developed in an academic context, and therefore no information on the design of such software has been available to date.