Blog Comments

  1. Paul's Avatar
    Hi Magis

    I removed access to the download because the program has not been updated since 2010. I was not aware that SANS referred to it in any of their courses.
  2. Magis's Avatar
    Hello Paul. MFTView was referenced in SANS FOR508 as a good tool for parsing $FILE_NAME timestamps. I can't access the download. Is the tool still available and supported through ongoing development? I can see the posts are quite a few years old so I'm thinking probably not?

    Thank you for your ongoing support of the forensics community.
  3. Paul's Avatar
    Thanks Harlan - my point re the deleted keys was that you don't know what you don't know. Checking for signatures is obvious, but I am still always wondering whether I am missing something...
  4. keydet89's Avatar

    Thanks for the review. I think it's extremely useful to get not only input and insight on the content from folks who actively work in the industry (particularly so that others can see it), but also their view of things such as the writing style and the approach taken. I've tried to get that sort of thing beforehand and it doesn't work well... so I have to write something, get folks to purchase and read it, and then get their input/insight to use on the next project.

    I'd like to comment on some of the things you mentioned in your post. First, I agree with you regarding the quality of the images...in my mind, many of them are simply too large in the print version of the book, and appear to be of poor quality. That's somewhat distressing, considering the work that went into the process of providing TIF format images.

    I see your point about the list of keys/values and how that might be useful, and it's something I can work toward in my next project. At the same time, I'd offer this idea to readers...take notes. If you find something of interest, put together your own list. After all, that's where many of us have started.

    As to your comment, "...then states that 'this, along with some other checks is how deleted keys can be located' - what other checks?", yes, I can see what you're saying there. I had assumed that some would be intuitively obvious... such as checking for the right signature ("nk"), a valid date, that sort of thing... but I can see how that could be missed.
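
    To make those checks concrete, here is a minimal sketch of that kind of heuristic - signature, plausible timestamp, sane name length - assuming you already have the raw bytes of a candidate hive cell. The offsets follow the commonly documented nk-record layout; this is illustrative, not anyone's production code.

```python
import struct
from datetime import datetime, timedelta, timezone

def looks_like_nk_record(cell: bytes) -> bool:
    """Heuristic checks for a candidate (possibly deleted) registry key
    record: correct "nk" signature, a plausible FILETIME, and a key-name
    length that actually fits inside the cell."""
    if len(cell) < 0x50:
        return False
    # Key cells carry the two-byte signature "nk" at offset 0.
    if cell[0:2] != b"nk":
        return False
    # Offset 4 holds the key's last-write time as a 64-bit FILETIME
    # (100-nanosecond intervals since 1601-01-01 UTC).
    filetime, = struct.unpack_from("<Q", cell, 4)
    epoch = datetime(1601, 1, 1, tzinfo=timezone.utc)
    try:
        written = epoch + timedelta(microseconds=filetime / 10)
    except OverflowError:
        return False
    # Reject timestamps outside a sane window for a Windows system.
    if not (datetime(1990, 1, 1, tzinfo=timezone.utc) <= written
            <= datetime(2040, 1, 1, tzinfo=timezone.utc)):
        return False
    # Offset 0x48 holds the key-name length; the name starts at 0x4C
    # and must fit inside the cell.
    name_len, = struct.unpack_from("<H", cell, 0x48)
    return 0 < name_len <= len(cell) - 0x4C
```

    None of these checks is conclusive on its own - stale data can pass all three - which is rather the point of the "what am I still missing?" question above.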

    Thanks again for writing and posting your thoughts, as I greatly appreciate it.
  5. GlosSteveC's Avatar
    I have encountered this scenario on the very rare occasion and to be fair I have almost always been told by the shamefaced officer what they have done.
    I just build the information into the analysis and deal with it.
    It is exactly the same as if the OIC has touched an exhibit - that is why they take elimination fingerprints from policemen.
    The "war story" relates to a murder, although it was still a missing person enquiry at the time, and I had to unpick the honest and concerned police officer's actions - he thought that if he could find the big clue he might save the victim. Sadly he was mistaken, and I had a bit of a job doing the analysis afterwards - but my report was eventually produced as evidence and accepted with nothing more than a quizzical remark by the Judge, who immediately accepted that the Officer's actions were honest and well meaning.
  6. Paul's Avatar
    That is an excellent article, Ian, and as I said on FF I wish I had seen it before I wrote mine. It really highlights exactly what I was talking about - particularly relevant when talking about the press, who quote "authoritative sources" yet get it wrong so often.

    The difference, though, is that in your case the article was posted for the world to see and was regurgitated verbatim without any work on the reliability of the source (and in fact from a source that is known to be often unreliable). Whereas in our field, any information picked up from a forum and used will end up in a report that a handful of people, mainly lawyers, will see and are unlikely to validate - and why should they? That's why they employ the experts, us.
  7. IanF's Avatar

    This blog reminds me of a "Social Media Experiment" that was carried out by a student here in Dublin last year.

    Have a read here for more info - http://www.guardian.co.uk/commentisf...ane-fitzgerald

    It just reinforces the requirement to have a verifiable source of information, rather than just re-hashing something that some other random person has written.
  8. Paul's Avatar
    Thanks Athulin

    I agree in part that there is an onus on the software developer to show where the data comes from, and I have already taken this to heart with LinkAlyzer and PmExplorer, and to some extent RevEnge. This clearly can't be done for all software, but we should be striving for it.

    However, I think an equal emphasis should be on users providing feedback. I get little feedback from users about issues with the software, and when I do it is often very poor and gives no clue as to what sequence of events caused a particular issue. Some of course is good, and I am working hard on a rewrite of one of my tools at the moment to fix a seemingly simple bug that actually requires some major changes to the way the code works.
  9. athulin's Avatar
    The Scientific Working Group on Digital Evidence (SWGDE) has a (draft?) document on 'Minimum Requirements for Quality Assurance in the Processing of Digital and Multimedia Evidence', which comes very close to this question. It must be noted that it takes the term 'forensic' quite literally, and also that it is more targeted to laboratories than individuals.

    For software tools in general, though, it seems to be less appreciated that in order to validate a tool, the tool really has to be designed to allow validation -- and that takes some sophistication on the part of the tool maker. A tool that just spits out 'Latest IP address assigned =', without providing full traceability of that information (either in the report, or in the documentation) cannot easily be validated.
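
    As a sketch of what such traceability could look like (the field names, registry path, and offset below are entirely invented for illustration), a tool could report every extracted value together with the file, offset, and raw bytes it was decoded from, so a reviewer can check it against the evidence:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A reported artefact together with its provenance, so the claim
    can be verified against the raw evidence rather than taken on trust."""
    label: str
    value: str
    source: str   # file or hive path the value was parsed from
    offset: int   # byte offset of the raw data within that source
    raw: bytes    # the exact bytes the value was decoded from

    def report_line(self) -> str:
        return (f"{self.label} = {self.value} "
                f"[{self.source} @ 0x{self.offset:X}, raw {self.raw.hex()}]")

# Hypothetical example - not real output from any tool.
f = Finding("Latest IP address assigned", "192.168.1.10",
            "SYSTEM\\ControlSet001\\Services\\Tcpip", 0x1A40,
            b"192.168.1.10")
print(f.report_line())
```

    A report line in this style can be validated byte-for-byte; a bare "Latest IP address assigned = 192.168.1.10" cannot.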

    Comparing output with another tool is not really good enough, unless a) both tools provide such traceability, and b) at least one of them has been validated to actually get the information from that source (and not just claim to). But when it comes down to that kind of detail, validation moves more and more into the area of software engineering and software quality assurance. I would not expect the average forensic analyst to move willingly into that field, or, when they do, to do that kind of job with the same proficiency. This is to a great extent the domain of the tool maker.

    Part of the onus to do validation must be placed on the tool maker, as part of the quality assurance their customers/users presumably require, without in the least lessening the requirement that the analyst be responsible for the analysis. And part of the task of the CF field should be to insist on such requirements being met, and even to document failures to do so. With such documentation in hand, those whispers may become 'Validate -- here's what might happen!'.
  10. Paul's Avatar
    Thanks for the feedback - not yet, but this sort of feedback is exactly what I want to see for all of my software.

    MFTView is a freebie, though, so development happens as and when I can afford the time. But your observations are important and these features should be included soon.
  11. jpascoe's Avatar
    Looks great! Will it also show if the data is resident or non-resident? Will it identify the clusters/data runs for each file?
  12. Paul's Avatar
    I am actively working on this at the moment, and I am incorporating a LinkAlyzer/PmExplorer-style hex view that will let you step through every single byte of an MFT entry and see where in the raw hex the data is stored.
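
    For anyone curious about the data-run question above, here is a minimal sketch of decoding a non-resident attribute's run list using the standard NTFS encoding. This is purely illustrative - it is not MFTView's code, and it does not specially handle sparse runs (offset size 0) or malformed lists.

```python
def decode_data_runs(runlist: bytes) -> list[tuple[int, int]]:
    """Decode an NTFS run list into (starting_cluster, length_in_clusters)
    pairs. Each run's header byte packs two nibbles: low nibble = size of
    the length field, high nibble = size of the offset field. Offsets are
    signed and relative to the previous run's starting cluster."""
    runs: list[tuple[int, int]] = []
    pos, cluster = 0, 0
    while pos < len(runlist) and runlist[pos] != 0:  # 0x00 terminates the list
        header = runlist[pos]
        len_size, off_size = header & 0x0F, header >> 4
        pos += 1
        length = int.from_bytes(runlist[pos:pos + len_size], "little")
        pos += len_size
        offset = int.from_bytes(runlist[pos:pos + off_size], "little",
                                signed=True)
        pos += off_size
        cluster += offset  # relative to the previous run's start
        runs.append((cluster, length))
    return runs
```

    For example, the run list 21 18 34 56 00 decodes to a single run of 0x18 clusters starting at cluster 0x5634; a following run with a negative offset field lands before the previous one, which is how fragmented files can hop backwards on disk.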

    Watch this space.