
Presenting Research

Dr. Anjum Naveed &

Dr. Peter Bloodsworth

Getting Started
  • Before you start:
      • Have your literature review handy
      • Be very clear regarding what you are trying to prove
      • Re-read your problem statement / research hypothesis
      • List the contributions that you want to claim in your thesis
      • For each point in your list you will need evidence to convince others that you have achieved it
      • People won’t just take your word for it
      • The most common reason for a rejected paper is getting this wrong
Learn From Others
  • Use your literature review
  • How have others doing work similar to yours evaluated their research?
  • What metrics did they use and why?
  • Are there any data sets that are widely used in your area?
  • How much proof is normally expected / presented?
  • Some fields require more proof than others
Compared to What?
  • Are there any systems that you can compare your work with?
  • Caution 1: It might sound clever to pick a legacy system to compare against
  • Your system will look better – right?
  • WRONG!! Researchers will spot this straight away and it will harm your credibility in the future
  • Caution 2: Be sure that you make fair comparisons
  • Don’t deliberately pick an inappropriate system to compare against
  • Don’t choose a data set that favours your system
  • Don’t make sweeping statements with limited proof!
Explicitly State Assumptions and Initial Conditions
  • Carefully write down the assumptions that you have made
  • Are they reasonable?
  • Could others question them?
  • How would you answer tough questions?
  • Results need to be bullet-proof
  • What initial conditions were set?
  • Could they bias the results?
  • What have you done to avoid this?
  • Could other researchers repeat your tests to verify the results?
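
One practical way to make tests repeatable is to record every assumption, initial condition, and random seed alongside the results. The sketch below is a minimal illustration of that idea in Python; the configuration fields and the run_experiment function are invented for the example, not taken from the slides.

```python
import json
import random
import time

# Hypothetical experiment configuration: every assumption and initial
# condition is written down explicitly so that others can repeat the run.
config = {
    "random_seed": 42,             # initial condition: fixes stochastic behaviour
    "dataset": "benchmark_v1",     # name the data set actually used
    "train_fraction": 0.8,         # initial condition: train/test split
    "assumptions": [
        "inputs are independent and identically distributed",
        "network latency is negligible",
    ],
}

random.seed(config["random_seed"])

def run_experiment(cfg):
    # Placeholder for the real test; here it just returns a made-up accuracy.
    return {"accuracy": round(random.uniform(0.80, 0.90), 3)}

result = run_experiment(config)

# Store the configuration next to the result so the run can be verified later.
record = {"config": config, "result": result, "timestamp": time.time()}
with open("experiment_record.json", "w") as f:
    json.dump(record, f, indent=2)
print(record["result"])
```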
Types of Evaluation
  • Two main types of evaluation:
    • Quantitative Evaluation
    • Qualitative Evaluation
  • The choice / mix of evaluation techniques depends on your thesis topic
  • Generally quantitative results allow us to make stronger claims
  • We have to be more careful when taking a qualitative approach; we can't claim too much
  • Ask for advice from your supervisor before starting on this
  • Try to write an early paper to get some feedback on your evaluation technique
Quantitative Evaluation
  • Numerical comparisons
  • System X performs 10% more accurately than System Y (see the sketch after this list)
  • The algorithm is 90% effective in classifying brain tumors
  • Makes use of statistical and other mathematical techniques
  • Includes formal mathematical proof
  • Logical proof and model checking
  • Regression to identify trends
  • Is very powerful but care needs to be taken because errors can be very costly
  • Examiners tend to be very numerate and will spot mistakes
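
As a concrete illustration of the "System X versus System Y" comparison above, the sketch below estimates the accuracy difference together with a bootstrap confidence interval. The per-case correctness lists are invented placeholders, and the bootstrap is just one possible technique, not the one the slides prescribe.

```python
import random

random.seed(0)

# Hypothetical per-test-case correctness (1 = correct, 0 = wrong) for two
# systems evaluated on the same 200 cases; replace with real measurements.
system_x = [1 if random.random() < 0.85 else 0 for _ in range(200)]
system_y = [1 if random.random() < 0.75 else 0 for _ in range(200)]

def accuracy(results):
    return sum(results) / len(results)

observed_diff = accuracy(system_x) - accuracy(system_y)

# Paired bootstrap: resample the test cases to see how stable the difference is.
n = len(system_x)
diffs = []
for _ in range(2000):
    idx = [random.randrange(n) for _ in range(n)]
    diffs.append(accuracy([system_x[i] for i in idx]) -
                 accuracy([system_y[i] for i in idx]))
diffs.sort()
low, high = diffs[int(0.025 * len(diffs))], diffs[int(0.975 * len(diffs))]

print(f"Observed accuracy difference: {observed_diff:.3f}")
print(f"95% bootstrap confidence interval: [{low:.3f}, {high:.3f}]")
```

If the interval excludes zero, the claim that one system outperforms the other rests on more than a single point estimate, which is the kind of care a numerate examiner will look for.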
Simulating Results
  • In some research areas you won't have access to the resources needed for testing
  • In such cases simulating or modeling can help us to generate quantitative results
  • Caution : The results will only be as good as our simulation / model!!
  • Use established simulation / modeling techniques and packages wherever possible
  • Carefully show that your simulation / model is accurate and that its configuration doesn’t introduce bias
  • Run several experiments and gradually increase complexity
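
As a minimal sketch of the last point, the toy Monte Carlo model below (an entirely hypothetical queue of tasks with random service times) repeats each configuration several times and gradually increases the problem size, so that trends and the variation around them are both visible.

```python
import random
import statistics

random.seed(1)

def simulate(num_tasks, service_rate=1.0):
    """Toy model: total time to process a batch of tasks with random service times."""
    return sum(random.expovariate(service_rate) for _ in range(num_tasks))

# Gradually increase complexity and repeat each configuration several times,
# reporting the mean and the spread rather than a single lucky run.
for num_tasks in (10, 100, 1000):
    runs = [simulate(num_tasks) for _ in range(30)]
    print(f"tasks={num_tasks:5d}  "
          f"mean total time={statistics.mean(runs):8.2f}  "
          f"stdev={statistics.stdev(runs):6.2f}")
```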
What Can Go Wrong?
  • Claiming too much, justifying too little
  • Using an inappropriate mathematical technique that introduces bias into the results
  • Making a basic mathematical error
  • Selecting data that isn’t representative of your problem domain
  • Choosing data which is biased in some way
  • Not doing enough testing
  • Not having a large enough data sample
  • Misinterpreting results – missing the point or drawing wrong conclusions
What Can Go Wrong? (continued)
  • Not using metrics that are expected in your domain
  • Misunderstanding metrics and applying them incorrectly
  • Choosing the wrong simulation tool and trying to force it to fit your problem
  • Badly configuring your simulation / model so that it doesn’t really describe your problem
  • Doing things in a hurry at the last minute increases all of the above risks – take your time!!
Qualitative Evaluation
  • This is not numerical
  • Is more descriptive
  • May involve criteria for success
  • Create a list of the features your system needs to demonstrate to be deemed a success (a checklist sketch follows this list)
  • Use a range of tests to show how the system behaves in response to stimuli
  • Try to anticipate the possible inputs to the system
  • Create a real use-case and make it the focus of your evaluation
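
One lightweight way to keep a qualitative evaluation honest is to write the success criteria down as an explicit checklist and record the outcome of each one while walking through the use case. The criteria below are invented placeholders; this is a sketch of the idea, not a prescribed format.

```python
# Hypothetical success criteria derived from the problem statement.
criteria = {
    "handles malformed input without crashing": None,
    "returns a response within 2 seconds": None,
    "produces output a domain expert judges usable": None,
}

# Outcomes observed while walking through the real use case
# (True = met, False = not met); fill these in from your own test sessions.
criteria["handles malformed input without crashing"] = True
criteria["returns a response within 2 seconds"] = True
criteria["produces output a domain expert judges usable"] = False

met = sum(1 for outcome in criteria.values() if outcome)
print(f"{met}/{len(criteria)} criteria met")
for name, outcome in criteria.items():
    print(f"  [{'x' if outcome else ' '}] {name}")
```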
What Can Go Wrong?
  • You claim too much – remember that qualitative results give you less real evidence
  • You make sweeping statements that you don't properly justify (avoid words like: generic, optimal, etc.)
  • You cover only a small range of possible inputs
  • You create a basic prototype and try to claim that it shows much more than it does
  • You don’t do enough testing
  • You set biased tests in some way without noticing
  • Your criteria are too limited to really test your work