Information Theory MCQ Quiz - Objective Questions with Answers on Information Theory
Last updated on Jun 13, 2025
Latest Information Theory MCQ Objective Questions
Information Theory Question 1:
What is the average probability of error when two symbols, 0 and 1, with source probabilities 0.2 and 0.8 respectively, are transmitted over a binary asymmetric channel with conditional error probabilities P(error | 0) = 0.1 and P(error | 1) = 0.4?
Answer (Detailed Solution Below)
Information Theory Question 1 Detailed Solution
Explanation:
Average Probability of Error in a Binary Asymmetric Channel:
To solve the given problem, we need to calculate the average probability of error when transmitting two symbols, 0 and 1, over a binary asymmetric channel. The channel is characterized by different probabilities of error when transmitting 0 or 1. Let’s begin by understanding the problem and calculating the required values step-by-step.
Definitions and Notations:
- Let P(0) and P(1) be the source probabilities of symbols 0 and 1, respectively.
- Let P(error | 0) and P(error | 1) be the conditional probabilities of error when transmitting 0 and 1, respectively.
- The average probability of error, denoted as P(error), can be calculated using the formula:
P(error) = P(0) × P(error | 0) + P(1) × P(error | 1)
Given Data:
- Source probabilities: P(0) = 0.2, P(1) = 0.8
- Conditional probabilities of error: P(error | 0) = 0.1, P(error | 1) = 0.4
Step-by-Step Calculation:
- Substitute the given values into the formula for P(error):
P(error) = P(0) × P(error | 0) + P(1) × P(error | 1) = (0.2 × 0.1) + (0.8 × 0.4)
- Perform the multiplications:
(0.2 × 0.1) = 0.02
(0.8 × 0.4) = 0.32
- Add the results to find the average probability of error:
P(error) = 0.02 + 0.32 = 0.34
Final Answer: The average probability of error is 0.34.
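As a quick sanity check, the same weighted average can be computed in a few lines of Python (a minimal sketch; the function name is illustrative, not part of the original solution):

```python
# A quick check of the weighted-average error calculation above.
# Probabilities are taken from the problem statement; the function name
# is illustrative, not part of the original solution.
def average_error_probability(p_source, p_error_given_symbol):
    """Weighted average of the conditional error probabilities."""
    return sum(p * pe for p, pe in zip(p_source, p_error_given_symbol))

p_source = [0.2, 0.8]   # P(0), P(1)
p_error = [0.1, 0.4]    # P(error | 0), P(error | 1)
print(round(average_error_probability(p_source, p_error), 2))  # 0.34
```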
Information Theory Question 2:
In the context of the Channel Capacity Theorem, what role does channel bandwidth play in determining the capacity of a communication channel?
Answer (Detailed Solution Below)
Information Theory Question 2 Detailed Solution
Concept
The Channel Capacity Theorem, also known as the Shannon-Hartley theorem, is a fundamental principle in telecommunications and information theory. It determines the maximum rate at which information can be transmitted over a communication channel without error, given the channel's bandwidth and the signal-to-noise ratio.
According to the Shannon-Hartley theorem, the channel capacity is directly proportional to the channel bandwidth. This means that increasing the channel bandwidth will increase the channel capacity, allowing more information to be transmitted over the channel without error.
The theorem is expressed as:
C = B log2(1 + S/N)
where:
C is the channel capacity in bits per second (bps), B is the channel bandwidth in hertz (Hz), S is the average signal power, and N is the average noise power.
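A minimal sketch of the formula in Python (the bandwidth and SNR values are illustrative, not from the question); it shows that capacity scales directly with bandwidth at a fixed SNR:

```python
# Shannon-Hartley capacity: C = B * log2(1 + S/N).
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Capacity in bits per second of a band-limited AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

print(shannon_capacity(3000, 1000))  # ~29,902 bps
print(shannon_capacity(6000, 1000))  # ~59,803 bps: doubling B doubles C
```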
Information Theory Question 3:
For a source with M equiprobable symbols, what is the formula for calculating entropy (H)?
Answer (Detailed Solution Below)
Information Theory Question 3 Detailed Solution
Concept
Entropy is a measure of the uncertainty or randomness in a set of data or information. In the context of information theory, it quantifies the average amount of information produced by a stochastic source of data. For a source with M equiprobable symbols, the entropy (H) is calculated using the formula:
H = log2(M) bits/symbol
where M is the number of equiprobable symbols.
This is because each symbol has an equal probability of occurring, and the entropy measures the average amount of information per symbol. When all symbols are equiprobable, the entropy is simply the logarithm of the number of symbols to the base 2, which represents the average number of bits needed to encode each symbol.
Therefore, the correct answer is option 2.
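As a side check, a minimal Python sketch (the value of M is illustrative) shows the general entropy sum collapsing to log2(M) for an equiprobable source:

```python
# For M equiprobable symbols the general entropy -sum(p * log2(p))
# reduces to log2(M). M = 8 is an illustrative value.
import math

M = 8
p = [1 / M] * M                                   # equiprobable source
H_general = -sum(pi * math.log2(pi) for pi in p)  # general formula
H_uniform = math.log2(M)                          # closed form

print(H_general, H_uniform)  # 3.0 3.0 (bits/symbol)
```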
Information Theory Question 4:
__________ is the informal network of communication that intersects several paths, circumvents rank or authority, and can link organizational members in any combination or direction.
Answer (Detailed Solution Below)
Information Theory Question 4 Detailed Solution
The correct answer is Grapevine.
Key Points
- Grapevine is a casual business communication channel.
- Its name stems from the fact that it extends in all directions throughout the company, regardless of authority levels.
- The phrase "heard through the grapevine" refers to information obtained through rumors or gossip that is informal and unofficial.
- The typical interpretation is that the information was spread orally among friends or coworkers, sometimes in a private manner.
- Even if there are official channels in an organization, informal channels usually arise from interactions between members of the organization.
Important Points
- In informational roles, managers create, acquire, and disseminate knowledge among staff members and senior colleagues in order to achieve organizational goals.
- All designs, including e-learning design, should adhere to the universal design concept of the hierarchy of information.
- Combining data from disparate sources with various conceptual, contextual, and typographic representations is known as information integration (II).
Information Theory Question 5:
Noise factor of a system is defined as:
Answer (Detailed Solution Below)
Information Theory Question 5 Detailed Solution
Noise figure (NF) and noise factor (F) are measures of the degradation of the signal-to-noise ratio (SNR) caused by components in a signal chain. The noise factor is defined as the ratio of the input SNR to the output SNR:
F = (SNR)i/p / (SNR)o/p
In dB it is given as:
(NF)dB = [(SNR)i/p]dB − [(SNR)o/p]dB
Important Points
In terms of noise resistance, NF is given as:
F = 1 + Rs/Req
Req = Equivalent input resistance of the antenna
Rs = Noise resistance of the system
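A minimal Python sketch of these relations (the SNR values are illustrative):

```python
# Noise factor F = (SNR)in / (SNR)out, and noise figure NF = 10*log10(F).
import math

def noise_factor(snr_in, snr_out):
    """Noise factor as a ratio of linear SNRs."""
    return snr_in / snr_out

F = noise_factor(1000, 500)   # the system halves the SNR
print(F)                      # 2.0
print(10 * math.log10(F))     # ~3.01 dB noise figure
```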
Top Information Theory MCQ Objective Questions
1 dB corresponds to ______ change in power level.
Answer (Detailed Solution Below)
Information Theory Question 6 Detailed Solution
The correct option is 2
Concept:
Decibels are used to express the ratio of two power values. The decibel scale is logarithmic, which makes it more convenient for comparing ratios than a plain arithmetic scale.
The decibel (dB) is defined as 10 times the base 10 logarithm of the power ratio:
dB = 10 log(P2/P1)
To get the power ratio when the dB value is given, the formula is rearranged as:
P2/P1 = 10^(dB/10)
So for a change of 1 dB, the power ratio is:
10^(1/10) ≈ 1.2589
Since 1.2589 − 1 = 0.2589, a 1 dB increase corresponds to roughly a 26% increase in power. (A 1 dB decrease gives a ratio of 10^(−1/10) ≈ 0.794, i.e. roughly a 21% decrease.)
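A minimal Python sketch of the conversions used above:

```python
# dB <-> power-ratio conversions.
import math

def db_to_power_ratio(db):
    """Linear power ratio corresponding to a dB value."""
    return 10 ** (db / 10)

def power_ratio_to_db(ratio):
    """dB value corresponding to a linear power ratio."""
    return 10 * math.log10(ratio)

print(db_to_power_ratio(1.0))  # ~1.2589, i.e. about a 26% increase
print(power_ratio_to_db(2.0))  # ~3.01 dB: doubling power adds ~3 dB
```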
Let (X1, X2) be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 and variance 4. The mutual information I(X1; X2) between X1 and X2 in bits is _______.
Answer (Detailed Solution Below) 0
Information Theory Question 7 Detailed Solution
Concept:
Mutual information of two random variables is a measure to tell how much one random variable tells about the other.
It is mathematically defined as:
I(X1; X2) = H(X1) − H(X1 | X2)
Application:
Since X1 and X2 are independent, knowing X2 tells us nothing about X1, so:
H(X1 | X2) = H(X1)
I(X1; X2) = H(X1) − H(X1) = 0
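For a concrete illustration, here is a minimal Python sketch for the discrete case (the marginal distributions are illustrative; the question's variables are continuous, but independence zeroes the mutual information in the same way):

```python
# Mutual information I(X;Y) = sum p(x,y) * log2(p(x,y) / (p(x)p(y))).
# For an independent pair, p(x,y) = p(x)p(y), so every log term is zero.
import math

px = [0.5, 0.5]                            # illustrative marginal of X
py = [0.25, 0.75]                          # illustrative marginal of Y
joint = [[a * b for b in py] for a in px]  # independence: p(x,y) = p(x)p(y)

I = sum(
    joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
    for i in range(len(px))
    for j in range(len(py))
)
print(I)  # 0.0 bits
```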
The maximum data rate of a noiseless 2-kHz binary channel is:
Answer (Detailed Solution Below)
Information Theory Question 8 Detailed Solution
The correct option is 3
Concept:
The maximum data rate of a noiseless channel is determined by the Nyquist formula, which states that the maximum data speed (in bits per second) is 2 x Bandwidth x log2(L), where L is the number of signal levels.
For a binary channel, you only have two signal levels (0 and 1), so the log2(L) part of the formula will be log2(2) = 1
So, if you have a 2-kHz binary channel, you substitute the bandwidth value into the formula giving: 2 x 2000 Hz x 1 = 4000 bps.
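A minimal Python sketch of the Nyquist formula (the values mirror the worked example above):

```python
# Nyquist maximum data rate for a noiseless channel: 2 * B * log2(L).
import math

def nyquist_max_rate(bandwidth_hz, levels):
    """Maximum data rate in bits per second."""
    return 2 * bandwidth_hz * math.log2(levels)

print(nyquist_max_rate(2000, 2))  # 4000.0 bps
```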
An event has two possible outcomes with probabilities p1 = 1/2 and p2 = 1/64. What is the rate of information if there are 16 outcomes per second?
Answer (Detailed Solution Below)
Information Theory Question 9 Detailed Solution
Concept:
Information associated with the event is “inversely” proportional to the probability of occurrence.
Entropy: The average amount of information per outcome is called the entropy (H).
Rate of information = r·H, where r is the number of outcomes per second.
Calculation:
Given: r = 16 outcomes/sec
H = (1/2)·log2(2) + (1/64)·log2(64) = 1/2 + 6/64 = 19/32 bits/outcome
∴ Rate of information Rs = r·H = 16 × 19/32
Rs = 19/2 = 9.5 bits/sec
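A minimal Python sketch of the calculation (probabilities as reconstructed in the question above):

```python
# Rate of information R = r * H for a two-outcome source.
import math

p = [1 / 2, 1 / 64]   # outcome probabilities from the question
r = 16                # outcomes per second

H = sum(pi * math.log2(1 / pi) for pi in p)  # 19/32 = 0.59375 bits/outcome
print(r * H)  # 9.5 bits/sec
```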
The channel capacity is measured in terms of:
Answer (Detailed Solution Below)
Information Theory Question 10 Detailed Solution
Channel Capacity theorem:
It states that the channel capacity C is the theoretical upper bound on the rate at which information can be communicated with an arbitrarily low error rate, given an average received signal power S, through an analog communication channel subject to additive white Gaussian noise (AWGN) of power N.
The capacity of a band-limited AWGN channel is given by:
C = B log2(1 + S/N)
C = maximum achievable data rate, with units of bits/sec when the input and output are continuous in nature (bits/channel use when they are discrete).
B = channel bandwidth
Note: In the expression for channel capacity, S/N is a linear power ratio (powers in Watts), not a value in dB.
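The note above is a common pitfall; here is a minimal Python sketch (values illustrative) converting a dB figure to a linear ratio before applying the formula:

```python
# The SNR must be a linear ratio in C = B * log2(1 + S/N).
import math

B = 4000                          # bandwidth in Hz (illustrative)
snr_db = 30                       # SNR quoted in dB (illustrative)
snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio: 1000.0

print(B * math.log2(1 + snr_linear))  # ~39,869 bps
```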
A (7, 4) block code has a generator matrix as shown.
If there is an error in the 7th bit, then the syndrome will be:
Answer (Detailed Solution Below)
Information Theory Question 11 Detailed Solution
The generator matrix is given in the question (in systematic form, G = [Ik | P]).
The parity check matrix is given by:
H = [P^T | I(n−k)], where n = 7 and k = 4
Syndrome:
S = eH^T
For an error in the 7th bit:
e = [0 0 0 0 0 0 1]
S = [0 0 1]
Extra information:
Syndrome for all possible errors
Error Pattern | Syndrome
0000000 | 000
0000001 | 001
0000010 | 010
0000100 | 100
0001000 | 101
0010000 | 111
0100000 | 011
1000000 | 110
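A minimal Python sketch of the syndrome computation; H here is inferred from the syndrome table above (its j-th column is the syndrome for an error in bit j), not copied from the question's printed matrix:

```python
# Syndrome computation S = e * H^T over GF(2).
# H is reconstructed from the syndrome table: column j is the syndrome
# produced by an error in bit j.
H = [
    [1, 0, 1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def syndrome(e):
    """Syndrome of a 7-bit error pattern e, computed modulo 2."""
    return [sum(ej * hj for ej, hj in zip(e, row)) % 2 for row in H]

e = [0, 0, 0, 0, 0, 0, 1]  # error in the 7th bit
print(syndrome(e))          # [0, 0, 1]
```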
________ is the average amount of information that must be delivered in order to resolve the uncertainty about the outcome of a trial.
Answer (Detailed Solution Below)
Information Theory Question 12 Detailed Solution
Entropy: It is the average amount of information that must be delivered in order to resolve the uncertainty about the outcome of a trial.
In other words, the entropy of a probability distribution is the average amount of information obtained when drawing from that distribution.
It is calculated as:
H = Σ pi log2(1/pi) bits/symbol, with the sum running over i = 1 to M
where pi is the probability of occurrence of the i-th symbol and M is the number of outcomes.
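A minimal Python sketch of the entropy formula (the distribution is illustrative):

```python
# Entropy H = sum(p_i * log2(1/p_i)) in bits per symbol.
import math

def entropy(probs):
    """Average information per symbol drawn from the distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
```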
Notes:
Bandwidth: The bandwidth of a communication channel is defined as the difference between the highest and lowest frequencies that the channel allows passing through it (i.e., its passband).
Quantum: A quantum is defined as the minimum amount of any physical entity involved in an interaction. For example, a photon is a single quantum of light.
If the probability of a message is 1/4, then the information in bits is:
Answer (Detailed Solution Below)
Information Theory Question 13 Detailed Solution
Concept:
Information associated with the event is “inversely” proportional to the probability of occurrence.
Mathematically, this is defined as:
I = log2(1/P) bits
where P and I represent the probability of the event and the information associated with it.
Calculation:
With P = 1/4, the information associated with it will be:
I = log2(1/P) = log2(4) = log2(2^2) bits
Since log_x(y^n) = n·log_x(y), the above can be written as:
I = 2 log2(2)
I = 2 bits
Entropy: The average amount of information is called the “Entropy”. It is given by:
H = Σ pi log2(1/pi) bits/symbol
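The calculation above can be checked in a couple of lines of Python (a minimal sketch):

```python
# Self-information of a message with probability 1/4: I = log2(1/P).
import math

P = 1 / 4
print(math.log2(1 / P))  # 2.0 bits
```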
Information is:
Answer (Detailed Solution Below)
Information Theory Question 14 Detailed Solution
Concept:
Information associated with the event is “inversely” proportional to the probability of occurrence.
Mathematically, this is defined as:
I = log2(1/P) bits
where P and I represent the probability of the event and the information associated with it.
Example:
Sun rises in the east: The probability of this event to occur = 1, and the information associated with it will be = 0.
Sun rises in the west: P = 0 and I = ∞
Entropy: The average amount of information is called the “Entropy”. It is given by:
H = Σ pi log2(1/pi) bits/symbol