
Everything You Need To Know About Digital Imaging

Digital imaging is the process of acquiring images of the body using X-rays, displaying them digitally, and viewing and storing them on a computer.

In the past, with conventional processing, the image of the body part appeared on a sheet of film. Radiologists placed the film on an illuminator for interpretation. After interpretation, the film was stored in a paper envelope and either sent to the physician or filed in a radiology file room. A great deal of manual work was therefore required to process each X-ray image.

Now, digital imaging is gradually replacing film imaging in radiography. It has many advantages; the workflow is faster and it allows image processing to optimize the clinical information from an image.

Digital imaging transforms an optical image into an array of numerical data.

[Image: A radiologist interpreting an X-ray image in digital imaging]

When was this process first introduced?

Digital imaging in radiology first appeared in the early 1970s, when Godfrey Hounsfield of England introduced the computed tomography (CT) scanner. Since then, several other areas of radiology, such as conventional radiography and ultrasound, have been converted to digital imaging.

Advantages of digital systems

Digital imaging offers a wide range of advantages over conventional film/screen imaging. A few of these include:

  • Soft-copy reporting
  • Separation of the acquisition and display processes
  • Increased latitude and dynamic range
  • Images can be accessed simultaneously at any workstation
  • Viewing systems can be set up in any location
  • Digital image archives can replace film libraries
  • Images are generally quicker to retrieve and less likely to be lost
  • Images can be post-processed to aid visualization of anatomy and pathology
  • No manual handling of cassettes with direct digital radiography (DDR) systems
  • Potential patient dose reduction
  • Potential lower running costs
  • No need to handle processing chemicals

What is Image Acquisition?

Image acquisition is a major step in any image processing system. Its general aim is to transform an optical image (real-world data) into an array of numerical data that can later be displayed on a computer.
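To make the idea concrete, here is a minimal sketch in Python (the array size and pixel values below are invented purely for illustration) showing that a digital radiograph is simply a matrix of numbers that a computer can display and manipulate:

```python
import numpy as np

# A digital radiograph is just a 2-D array of pixel values.
# Here we fabricate a tiny 4 x 4 "image" with 12-bit values (0-4095)
# purely for illustration; a real detector produces millions of pixels.
image = np.array([
    [ 120,  340,  560,  300],
    [ 400, 2900, 3100,  450],
    [ 380, 3050, 2980,  420],
    [ 150,  360,  500,  280],
], dtype=np.uint16)

print("Matrix size:", image.shape)          # rows x columns
print("Minimum pixel value:", image.min())  # darkest pixel
print("Maximum pixel value:", image.max())  # brightest pixel

# Window/level adjustment (a common post-processing step) is just
# arithmetic on the array: rescale a range of interest to 0-255 for display.
window_min, window_max = 300, 3100
display = np.clip((image.astype(float) - window_min) / (window_max - window_min), 0, 1) * 255
print(display.astype(np.uint8))
```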

Several technologies are commonly used for digital image acquisition in planar radiography, such as:

  • DDR devices
  • CR scanners (imaging with photostimulable or storage phosphors)

Direct digital radiography (DDR)

DDR is an advanced form of X-ray technology in which individual digital images are acquired quickly and at low dose.

A DDR system requires changes to the X-ray couch and vertical Bucky design, as well as to the X-ray tube assembly. Unlike the removable CR cassette, the DR plate (detector) is fully integrated into the exposure equipment. The radiographic image of the patient appears on the acquisition workstation within a few seconds, where it can be optimized and made available for reporting, or the exposure repeated if necessary.

Uses of DDR may include:

  • General radiography and mobile radiography
  • Small-field mammography
  • Fluoroscopy, where DDR detectors are now used in place of image intensifiers

Computed radiography (CR)

Computed radiography (CR) is a type of digital imaging system introduced in the United States in 1983 by Fuji Medical Systems of Japan. CR usually refers to cassette-based digital imaging, because the image of the body part is obtained using a cassette that contains a storage phosphor plate. CR can be used in almost all areas where film/screen systems are currently in operation, including mammography.

What are some Digital radiography technologies?

The most common detector technologies used in digital radiography are:

  • X-ray scintillator: this may be coupled to a charge-coupled device (CCD) or to a read-out array (an amorphous silicon photodiode/thin-film transistor (TFT) array).
  • X-ray photoconductor: usually amorphous selenium bonded to a TFT read-out array.

Both of these technologies can be constructed in the form of a flat panel. The scintillator converts the X-rays into light and is usually coupled directly to an amorphous silicon photodiode TFT flat-panel read-out array. The light from the scintillator is converted into electrical charge in the photodiode array, and this charge is stored until it is read out from each pixel. Such detectors are commonly known as amorphous silicon systems.

What is Scanning technology?

Scanning technology is an alternative detection method for covering the full image area: a linear array of detectors is scanned across the patient. This method usually gives good contrast differentiation and scatter rejection, but it has several disadvantages, such as a long exposure time and high tube loading. In addition, keeping the scanning radiation beam aligned requires tight mechanical stability of the scanning mechanism.

What Factors can affect the image quality?

Many factors, both inherent in the equipment design and external to it, can affect image quality. These include:

1. Fill factor

In a flat-panel detector, a proportion of each detector element is taken up by read-out circuitry and is therefore insensitive to the incoming light photons or electrons. This leads to the concept of the fill factor: the ratio of the sensitive area of the pixel to the total area of the detector element. Improving resolution requires a smaller pixel pitch, so the fill factor tends to decrease as resolution improves, because the read-out electronics occupy a larger fraction of each element and lower the detector sensitivity.
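As a rough numerical illustration (the pixel pitch and electronics area below are made-up values, not taken from any real detector), the fill factor is simply a ratio of areas, so it falls as the pixel pitch shrinks while the read-out electronics stay the same size:

```python
def fill_factor(pixel_pitch_um: float, electronics_area_um2: float) -> float:
    """Ratio of light-sensitive area to total detector-element area.

    Assumes a square pixel whose total area is pixel_pitch^2 and whose
    read-out electronics occupy a fixed, insensitive area.
    """
    total_area = pixel_pitch_um ** 2
    sensitive_area = max(total_area - electronics_area_um2, 0.0)
    return sensitive_area / total_area

# Hypothetical numbers: electronics occupy 4000 square micrometres per pixel.
for pitch in (200, 150, 100):  # pixel pitch in micrometres
    print(f"{pitch} um pitch -> fill factor {fill_factor(pitch, 4000):.2f}")
# The output shows the fill factor falling as the pixel pitch is reduced.
```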

2. Tiling

A tiled array consists of several detectors abutted together to sample the whole image. However, there are small areas of the read-out devices that are not sensitive, because of the gaps between the detectors (typically about 100 µm).

3. Grids

Low grid strip densities may result in interference patterns in the image. The problem can be solved by using moving grids or high-density grids of more than 60 lines per cm. When using CR, the grid lines must be perpendicular to the scan lines in the reader.

4. Automatic exposure control response

An automatic exposure control (AEC) for a film/screen system ensures that the correct optical density is achieved across a range of kilovoltages. This approach is not directly applicable to digital imaging. The AEC needs to be set up in collaboration between the radiology and medical physics departments, and the exposure level should be optimized for the selected examination. Another consideration is that when a film system is replaced by a CR system, the AEC is often left at the same settings. This is usually not the optimal working level, because the energy response of the digital system differs from that of the film system it replaces.

5. Bit depth/image size

A pixel is the smallest element of a digitized picture, and the pixel pitch is the distance between the centers of adjacent pixels. The bit depth of the image determines its contrast resolution. The analog output value from each pixel is converted to digital form, and the result is stored at a separate location in a matrix.
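For example, assuming an illustrative 2048 × 2048 matrix and 12-bit pixels (not the specification of any particular system), the bit depth sets the number of grey levels and, together with the matrix size, the storage needed per image:

```python
matrix_rows, matrix_cols = 2048, 2048   # assumed image matrix, for illustration only
bit_depth = 12                          # bits per pixel

grey_levels = 2 ** bit_depth            # number of distinct pixel values
# Pixels are normally stored in whole bytes, so 12-bit data occupies 2 bytes.
bytes_per_pixel = (bit_depth + 7) // 8
image_size_mb = matrix_rows * matrix_cols * bytes_per_pixel / 1024**2

print(f"Grey levels per pixel: {grey_levels}")              # 4096 for 12 bits
print(f"Uncompressed image size: {image_size_mb:.1f} MB")   # 8.0 MB here
```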

6. Image optimization

Image quality is related to the radiation exposure received by the detector. Although a relatively low exposure makes an image noisy, it may still contain enough information to be diagnostically acceptable. Higher exposure improves image quality because quantum noise is reduced. However, the improvement is not linear: it eventually levels off as the quantum noise decreases, and the plate can become overexposed.
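The diminishing return with increasing exposure follows from quantum (Poisson) statistics: the signal-to-noise ratio grows only with the square root of the number of detected photons. A minimal sketch with made-up photon counts:

```python
import math

# Relative quantum noise scales as 1 / sqrt(N) for N detected photons,
# so the signal-to-noise ratio (SNR) scales as sqrt(N).
for photons_per_pixel in (100, 1_000, 10_000, 100_000):
    snr = math.sqrt(photons_per_pixel)
    print(f"{photons_per_pixel:>7} photons/pixel -> SNR ~ {snr:,.0f}")

# Each tenfold increase in exposure improves SNR only by a factor of about 3.2,
# which is why image quality gains level off at high exposures.
```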

HIS and RIS

Hospital information systems (HIS) and radiology information systems (RIS) hold patient and examination details. When the digital acquisition system is connected to a HIS/RIS, departmental workflow is streamlined and patient throughput improves. Because patient demographic and examination details are attached to the image automatically from the RIS, images can be delivered much faster. HIS/RIS systems mostly use the Health Level 7 (HL7) standard for the transfer of patient details.
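HL7 v2 messages are plain text, with segments on separate lines and fields separated by '|' characters. The sketch below parses a single, entirely fictitious PID (patient identification) segment to show the idea; real interfaces rely on dedicated HL7 libraries and site-specific conventions:

```python
# A fictitious HL7 v2 PID segment (all field values are invented).
pid_segment = "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F"

fields = pid_segment.split("|")

# In HL7 v2, PID-3 carries the patient identifier and PID-5 the patient name.
patient_id = fields[3].split("^")[0]
family_name, given_name = fields[5].split("^")[:2]

print("Patient ID:", patient_id)            # 12345
print("Name:", given_name, family_name)     # JANE DOE
```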

DICOM

Digital Imaging and Communications in Medicine (DICOM) is a protocol-based standard that facilitates the transfer of digital images. When buying a system, a DICOM conformance statement should be obtained to establish how the device and software conform to the standard for their particular function. If a modality cannot create images in DICOM format, it cannot be upgraded to DICOM.
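As an illustration of what a DICOM file carries, the open-source pydicom library (one common choice, not something prescribed by the text above) can read the patient and examination metadata stored alongside the pixel data; the file name here is hypothetical:

```python
import pydicom

# Read a DICOM file (hypothetical file name, for illustration).
ds = pydicom.dcmread("chest_xray.dcm")

# Patient and examination details travel inside the same file as the image.
print("Patient name:", ds.PatientName)
print("Modality:", ds.Modality)
print("Rows x Columns:", ds.Rows, "x", ds.Columns)

# The pixel data itself is available as a NumPy array for display or processing.
pixels = ds.pixel_array
print("Pixel value range:", pixels.min(), "-", pixels.max())
```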

PACS

A Picture Archiving and Communication System (PACS) is an image communication system: it delivers images and information around the department. Patient information from the HIS/RIS is routed correctly and the corresponding images are retrieved. Images are viewed at various points in the system, and how they can be manipulated depends on the type and function of the equipment. Typical components are acquisition, reporting, and viewing workstations, monitors, and laser printers.
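Communication between PACS nodes uses DICOM network services. As a minimal sketch, the open-source pynetdicom library (again, just one possible tool) can send a DICOM C-ECHO "verification" request to check that a PACS node is reachable; the address, port, and AE title below are placeholders:

```python
from pynetdicom import AE

# Application Entity acting as the requesting node (AE title is a placeholder).
ae = AE(ae_title="TEST_SCU")
ae.add_requested_context("1.2.840.10008.1.1")  # Verification SOP Class UID

# Hypothetical PACS address and port.
assoc = ae.associate("pacs.example.org", 104)
if assoc.is_established:
    status = assoc.send_c_echo()                    # DICOM C-ECHO request
    print(f"C-ECHO status: 0x{status.Status:04X}")  # 0x0000 means success
    assoc.release()
else:
    print("Could not associate with the PACS node")
```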
