KEYWORDS: Digital signal processing, Calibration, Artificial intelligence, Analog electronics, Robotic surgery, Quadrature amplitude modulation, Orthogonal frequency division multiplexing, Modulation, Evolutionary algorithms, Detection and tracking algorithms
With the advent of 5G wireless networks, throughputs of tens of gigabits per second at low latency have become a reality. This level of performance will fuel numerous real-time applications in which computationally heavy tasks are offloaded to the cloud. The increase in bandwidth, together with the use of dense constellations, places a significant burden on the speed and accuracy of analog-to-digital converters (ADCs). A popular approach to building wideband ADCs is to combine multiple channels, each operating at a lower sampling rate, in a time-interleaved arrangement. However, an interleaved ADC comes with its own set of challenges: the parallel architecture is highly sensitive to inter-channel mismatch, timing jitter, and clock skew between the ADC channels, as well as to nonlinearity within the individual channels. In this work, we use a deep learning algorithm to learn the complete, complex behavior of the interleaved ADC and to compensate for it.
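The abstract does not specify the network architecture or the mismatch model, so the following is only an illustrative sketch of the general idea: a simulated time-interleaved ADC whose channels have assumed gain, offset, and cubic-nonlinearity errors, post-corrected by a small fully connected network that maps a sliding window of raw samples (plus the center sample's channel index) to the ideal value. All parameter values, the window length, and the network size are assumptions, not the authors' method.

```python
# Illustrative sketch only: learned post-correction of a simulated
# 4-channel time-interleaved ADC with hypothetical per-channel mismatch.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
M, N, WIN = 4, 1 << 14, 9              # channels, samples, correction window

# Ideal wideband test signal (sum of tones), normalized to [-1, 1]
t = np.arange(N)
ideal = sum(np.sin(2 * np.pi * f * t + p)
            for f, p in [(0.0137, 0.3), (0.113, 1.1), (0.231, 2.0)])
ideal /= np.abs(ideal).max()

# Assumed per-channel mismatch: gain error, offset, and mild cubic nonlinearity
gain = 1 + 0.02 * rng.standard_normal(M)
offset = 0.01 * rng.standard_normal(M)
cubic = 0.05 * rng.standard_normal(M)

# Apply each channel's distortion to the samples it digitizes (every M-th sample)
distorted = ideal.copy()
for m in range(M):
    x = ideal[m::M]
    distorted[m::M] = gain[m] * x + offset[m] + cubic[m] * x ** 3

# Training data: sliding window of distorted samples -> ideal center sample,
# with a one-hot channel index so the network knows which sub-ADC it is correcting
X = np.lib.stride_tricks.sliding_window_view(distorted, WIN)
centers = (np.arange(len(X)) + WIN // 2) % M
X_full = np.concatenate([X, np.eye(M)[centers]], axis=1)
y = ideal[WIN // 2 : WIN // 2 + len(X)]

X_t = torch.tensor(X_full, dtype=torch.float32)
y_t = torch.tensor(y, dtype=torch.float32).unsqueeze(1)

# Small MLP acting as the learned correction filter
net = nn.Sequential(nn.Linear(WIN + M, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(300):
    opt.zero_grad()
    loss = loss_fn(net(X_t), y_t)
    loss.backward()
    opt.step()

with torch.no_grad():
    corrected = net(X_t).squeeze(1).numpy()

print(f"RMS error before correction: {np.sqrt(np.mean((X[:, WIN // 2] - y) ** 2)):.4e}")
print(f"RMS error after correction:  {np.sqrt(np.mean((corrected - y) ** 2)):.4e}")
```

In practice the correction network would be trained on measured ADC outputs against a known reference signal rather than on a simulated mismatch model, and timing skew and jitter would require a wider input window than the one assumed here.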