SQLite Forensics Book, now available on Amazon



Paul's blogs/ramblings

Are we gullible or just naive?

It never fails to amaze me how many computer forensics investigators are happy to just regurgitate something they have read on a forensics forum or on the Internet in general. While the Internet is obviously a great source of information, we do appreciate, don't we, that it is populated by the well-meaning but sometimes ill-informed.

It doesn't take you long to find a thread on a computer forensics forum (this includes those forums that are closed to the public) where someone with a problematic evidential hard disk is told to put it in the fridge, or swap the circuit board. While both of these are possible solutions in limited circumstances they both have downsides and these downsides are rarely discussed, or at least not early in the thread.

Mostly, of course, the posters are well-meaning and are offering what they believe is good advice, but it is advice that might not be best in a particular situation and could, in extremis, be devastating.

It is also often the case that a poster on a given forensic forum asks for advice about an artifact found with forensic tool X, and the first bit of advice offered is "validate with tool Y" - why aren't we doing this as a matter of course? This is especially relevant to software that decodes a closed file format, but also to software that decodes a more complex open file format; as a software engineer, I would be the first to admit that we don't always get it right first time.

This, of course, is good advice and perhaps should be in the signature of all good forensic investigators, but is it advice that should not need to be offered?

In posts of a similar nature to this article I have had the argument "well, we have to start somewhere". But is this acceptable? We are, after all, playing with people's freedom. Perhaps the time has come for professional recognition, but that is a can of worms that can be dealt with elsewhere.

The issue for me, though, is what picture does this paint of the mindset of the person offering the advice? As investigators we should have inquiring minds, and we should take anything we read with a pinch of salt. We would never just take the sender address of a received email as gospel, but what about when we ask for help on the Internet (and most of us will do this at some time, as none of us know everything - do we)? Do we just take the answer we get as gospel, or do we sit down and test the solution? I would like to think the latter, but reading between the lines on the various Internet forums (I hate the word "fora") this does not seem to be the case.

I really don't know what the answer is. Should we be forced to issue a disclaimer, when offering advice, that the user should test the advice given? Not practicable and certainly not enforceable. Should we have a points system where all advice or advisors are rated? Not practical, and it could rule out good advice from "newbies" or those wanting to remain anonymous. Or do we just go on as we are, with those of us who know (or think we know) what we are doing validating any advice offered, and letting those on the other side of the fence just get on with it?

The answer probably lies somewhere in between, and I guess we need to accept that this is a rapidly growing field which at one level is easy to get into (buy a dongle or download some open source software and claim you have been practicing for 10 years) and that standards will differ. The onus is then on the rest of us, and by this I mean the majority, to continually sit on the sidelines whispering "validate, validate".

Or perhaps we just need to remember that as investigators our job is not simply to understand the evidence on the computers in the case we are currently investigating, but to understand, in depth, what the tools we are using are showing.

On the plus side at least the posters alluded to above are asking questions, perhaps it is those who we don't hear from that we should be worried about...

Submit "Are we gullible or just naive?" to Facebook Submit "Are we gullible or just naive?" to Twitter Submit "Are we gullible or just naive?" to Digg Submit "Are we gullible or just naive?" to del.icio.us Submit "Are we gullible or just naive?" to StumbleUpon Submit "Are we gullible or just naive?" to Google



  1. athulin
    The Scientific Working Group on Digital Evidence (SWGDE) has a (draft?) document on 'Minimum Requirements for Quality Assurance in the Processing of Digital and Multimedia Evidence', which comes very close to this question. It must be noted that it takes the term 'forensic' quite literally, and also that it is more targeted to laboratories than individuals.

    For software tools in general, though, it seems to be less appreciated that in order to validate a tool, the tool really has to be designed to allow validation -- and that takes some sophistication on the part of the tool maker. A tool that just spits out 'Latest IP address assigned =', without providing full traceability of that information (either in the report, or in the documentation) cannot easily be validated.

    Comparing output with another tool is not really good enough, unless a) both tools provide such traceability, and b) at least one of them has been validated to actually get the information from that source (and not just claim to). But when it comes down to that kind of detail, validation moves more and more into the area of software engineering and software quality assurance. I would not expect the average forensic analyst to move willingly into that field, or, when they do, to do that kind of job with the same proficiency. This is to a great extent the domain of the tool maker.
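    To make the point concrete, here is a minimal sketch of what "traceability plus cross-tool comparison" could look like. The report format, artefact name, and offsets are all invented for illustration - the idea is simply that a tool which reports *where* in the evidence a value came from can be checked against the raw bytes, while a tool that only prints the value cannot.

    ```python
    # Hypothetical sketch: each tool's report maps an artefact name to
    # (byte offset in the evidence, the raw bytes the tool says it decoded).
    # A traceable claim can be verified directly against the evidence.

    def verify_claim(evidence: bytes, offset: int, claimed: bytes) -> bool:
        """Check that the evidence really contains `claimed` at `offset`."""
        return evidence[offset:offset + len(claimed)] == claimed

    def cross_validate(evidence: bytes, report_x: dict, report_y: dict) -> dict:
        """Compare two tools' claims and trace each back to the raw bytes."""
        results = {}
        for name in report_x.keys() & report_y.keys():
            off_x, raw_x = report_x[name]
            off_y, raw_y = report_y[name]
            results[name] = {
                "tools_agree": raw_x == raw_y,
                "x_traceable": verify_claim(evidence, off_x, raw_x),
                "y_traceable": verify_claim(evidence, off_y, raw_y),
            }
        return results

    # Toy "evidence" with an IP address string embedded at offset 8.
    evidence = b"\x00" * 8 + b"192.168.0.1" + b"\x00" * 4
    report_x = {"last_ip": (8, b"192.168.0.1")}   # claim matches the bytes
    report_y = {"last_ip": (8, b"192.168.0.2")}   # tool decoded something else
    print(cross_validate(evidence, report_x, report_y))
    ```

    Note that the two tools disagreeing tells you only that at least one is wrong; it is the traceability check that tells you which claim is actually supported by the evidence.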

    Part of the onus to do validation must be placed on the tool maker, as part of the quality assurance their customers/users presumably require, without in the least lessening the requirement that the analyst be responsible for the analysis. And part of the task of the CF field should be to insist on such requirements being met, and even to document failure to do so. With such documentation in hand, those whispers may become 'Validate -- here's what might happen!'.
  2. Paul
    Thanks, Athulin.

    I agree in part that there is an onus on the software developer to show where the data comes from, and I have already taken this to heart with LinkAlyzer and PmExplorer, and to some extent RevEnge. This clearly can't be done for all software, but we should be striving for it.

    However, I think an equal emphasis should be on users providing feedback. I get little feedback from users about issues with the software, and when I do it is often very poor and gives no clue as to what set of events caused a particular issue. Some of course is good, and I am working hard on a rewrite for one of my tools at the moment to fix a seemingly simple bug that actually requires some major changes to the way the code works.
  3. IanF

    This blog reminds me of a "Social Media Experiment" that was carried out by a student here in Dublin last year.

    Have a read here for more info - http://www.guardian.co.uk/commentisf...ane-fitzgerald

    It just reinforces the requirement to have a verifiable source of information, rather than just rehashing something that some other random person has written.
  4. Paul
    That is an excellent article, Ian, and as I said on FF, I wish I had seen it before I wrote mine. It really highlights exactly what I was talking about - particularly relevant when talking about the press, who quote "authoritative sources" yet get it wrong so often.

    The difference, though, is that in your case the article was posted for the world to see and was regurgitated verbatim without any checking of the reliability of the source (and, in fact, from a source that is known to be often unreliable). Whereas in our field, any information picked up from a forum and used will appear in a report that a handful of people, mainly lawyers, will see - and who are unlikely to validate it. And why should they? That's why they employ the experts: us.