To determine whether it is going to spider your new or changed content, Google relies on your server's response to its last request for each file. An RSS feed alone will not increase the crawl rate, because Google would still need to crawl your site before becoming aware of any updates.
This discussion can get very deep, but to all intents and purposes, if you wish to be crawled regularly, your server needs to make it appear that the content has had a "hard update".
What about Content Management Systems? Well, this is where the XML sitemap can provide some serious benefits. Although no files on disk are directly updated, you make Google aware of the frequency of change, and ping it where necessary. It's not a good idea to ping too frequently, but it is acceptable to run the ping on a cron job if you do have regularly updated content.
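A minimal sketch of what a CMS cron job might generate (URLs and helper names are my own; the ping endpoint shown is the one Google documented, since deprecated): `lastmod` and `changefreq` are the hints that stand in for the missing file timestamps.

```python
from datetime import date
from urllib.parse import quote
from xml.sax.saxutils import escape

def sitemap_entry(loc: str, lastmod: date, changefreq: str = "daily") -> str:
    """One <url> element; changefreq hints how often the content changes."""
    return (
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        "  </url>"
    )

def build_sitemap(entries: list[str]) -> str:
    """Wrap the entries in the standard sitemaps.org urlset envelope."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

def ping_url(sitemap_url: str) -> str:
    """Build the (historical) Google ping URL; fetch it after regenerating."""
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

xml = build_sitemap([sitemap_entry("https://example.com/news", date(2009, 1, 2))])
```

Regenerate the sitemap only when content actually changes, and ping no more often than that; pinging an unchanged sitemap on every cron run is exactly the over-pinging warned about above.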
This technique can also be applied to your pages with nested RSS, but I would strongly advise against it.
Content is King, and should be unique and informative.