Did the acceptance testing of our first digital radiography (DR) unit, a GE Revolution XR/d, a couple of days ago.
As imaging technology changes, I usually have to adapt my testing methods to fit. Some technologies render certain tests obsolete or irrelevant while other tests need to be modified, or the analysis changed. In the past, I've had to modify my test and analysis procedures for CR units and more recently multi-detector CT scanners.
This new DR unit was no exception. Being a digital unit, a few things went a little faster and easier. Images pop up within 15 seconds of the exposure, so a lot of the time spent waiting for images to appear gets cut out. The table detector is electronically coupled to the location of the tube and slides along as you move the tube down the table, so there's no need to fuss with centering the tube over the detector. The folks at GE were even kind enough to incorporate a patient entrance dose display and cumulative exposure counter on the workstation. For some reason though, they've apparently decided to forgo any kind of exposure index - some indication to the technologist that the x-ray exposure they just made falls within an acceptable range for image quality. At least there wasn't one that I could find or that the service engineer knew about.
An exposure index is a very useful tool for providing feedback to the technologist. With conventional film/screen, the tech can easily see whether the exposure was too much or too little from how dark the developed film comes out. With digital imaging, there's no relationship between the appearance of the image and the adequacy of the exposure, aside from the noise level. Almost all CR manufacturers provide some form of exposure index that's displayed to the tech, so I'm puzzled as to why this GE DR unit doesn't have anything. Maybe I'll just have to dig deeper to find it.
The first problem this caused was how to test the kV and thickness tracking for the phototimer. The phototimer is responsible for making sure the image receptor (film, CR cassette, DR receptor) gets enough radiation to produce an adequate image. For film, you measure the optical density (OD). With CR, I use whatever exposure index the CR vendor provides as an analog for OD. With this GE DR unit, there wasn't anything immediately obvious to use. So after a bit of mucking around with the software to see what I could find, I eventually ended up taking the mean pixel value in a central region of interest on the raw, unprocessed image and tracking that value. Everything seemed to come out ok, although I have no feel yet for what an acceptable range would be. Something I'll have to work out, I suppose. In the meantime, this lack of any kind of exposure index seems like a potentially serious issue as far as providing feedback to the technologist.
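For anyone wanting to automate this, the mean-pixel-value tracking can be sketched roughly as below, assuming the raw image can be exported as a 2D array (the function name, ROI fraction, and synthetic test image are all my own for illustration, not anything from the GE software):

```python
import numpy as np

def central_roi_mean(image, fraction=0.1):
    """Mean pixel value in a centered ROI spanning `fraction` of each dimension.

    This stands in for film OD (or a CR exposure index) when tracking
    phototimer kV/thickness response on a DR detector.
    """
    h, w = image.shape
    dh, dw = int(h * fraction / 2), int(w * fraction / 2)
    cy, cx = h // 2, w // 2
    roi = image[cy - dh:cy + dh, cx - dw:cx + dw]
    return float(roi.mean())

# Synthetic stand-in for a raw flat-field exposure: uniform signal plus noise
rng = np.random.default_rng(0)
raw = rng.normal(loc=2000.0, scale=20.0, size=(2048, 2048))
print(central_roi_mean(raw))
```

Tracking would then mean recording this value for each kV and phantom thickness and checking that it stays roughly constant, the same way you'd expect film OD to hold steady across phototimer tracking tests.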
The other new thing that needs to be done is the detector evaluation. These detectors need to be properly calibrated, and I'll probably have to include procedures for verifying the calibration. Somewhere in the world of AAPM subcommittees and task groups, there was one putting out a report on testing CR and DR units, which is something I've been waiting on for a while and is just what I need for this task. I didn't see it on the list of active task groups, and last I heard the final report was coming RSN, so hopefully I'll see something soon.
We've got a couple more DR rooms being installed in the next few months (hopefully), so I'll have a chance to try out some new procedures in a little while.