How does a video card drive an output device (e.g., a monitor or TV)?
Please explain it not at the circuit level, but at a layman's level.
As developers, some of us know that the card uses memory to keep track of video data. But how does that "video data" end up on the screen?
What are the steps, exactly? Thanks.
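To make the question concrete, here is roughly how I picture that video memory: a plain block of RAM holding one color value per pixel. This is only a conceptual sketch (the sizes, names, and layout here are made up for illustration), and the part I'm asking about is everything that happens after such memory is filled in.

```c
#include <stdint.h>
#include <stdlib.h>

/* Conceptual sketch only: a "framebuffer" is a contiguous block of memory
 * holding one color value per pixel. Real video memory lives on the card,
 * but the basic idea is the same. */

#define WIDTH  640
#define HEIGHT 480

typedef struct {
    uint8_t r, g, b;   /* one byte each for red, green, blue */
} pixel_t;

int main(void)
{
    /* Allocate WIDTH * HEIGHT pixels, stored row after row ("raster order"). */
    pixel_t *framebuffer = calloc((size_t)WIDTH * HEIGHT, sizeof(pixel_t));
    if (!framebuffer)
        return 1;

    /* Setting pixel (x, y) is just writing to the right offset in memory. */
    int x = 100, y = 50;
    framebuffer[y * WIDTH + x] = (pixel_t){255, 0, 0};   /* a single red dot */

    /* The open question: how does the card turn this memory into a signal
     * on the cable that the monitor actually displays? */

    free(framebuffer);
    return 0;
}
```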