Hi,
Firstly, please note that AdaBoost is not the same concept as boosting. Boosting is a more general family of ensemble methods, and AdaBoost is one particular instance of it.
I don't think AdaBoost uses only part of the training sample. It uses all of the samples to build each weak learner (reweighting them at every round so that misclassified points count more) and then combines all the weak learners into a stronger classifier. A rough sketch of that loop is shown below.
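To make the reweighting concrete, here is a simplified sketch of the discrete AdaBoost loop in Python, using scikit-learn decision stumps as the weak learners; the dataset and the number of rounds are just placeholders. Notice that every round fits on the full training set, only the sample weights change:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
y = np.where(y == 0, -1, 1)          # AdaBoost convention: labels in {-1, +1}

n = len(y)
w = np.full(n, 1.0 / n)              # every sample starts with equal weight
learners, alphas = [], []

for _ in range(10):                  # 10 boosting rounds
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)  # all samples are used, just weighted
    pred = stump.predict(X)
    err = np.sum(w * (pred != y)) / np.sum(w)
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
    w *= np.exp(-alpha * y * pred)    # up-weight the misclassified samples
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

# final prediction: weighted vote over all the weak learners
F = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
print("training accuracy:", np.mean(np.sign(F) == y))
```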
I guess what you are really referring to is stochastic gradient boosting, rather than AdaBoost. Stochastic gradient boosting does have such a random sub-sampling step when training each base learner. The randomness helps to reduce overfitting and also speeds up training. For details, see Friedman (1999), "Stochastic Gradient Boosting".
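If you want to play with that sub-sampling behaviour, scikit-learn's GradientBoostingClassifier exposes it through the subsample parameter; a minimal sketch with an illustrative dataset and settings:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, random_state=0)

# subsample < 1.0 turns plain gradient boosting into stochastic gradient
# boosting: each tree is fit on a random 50% of the training rows.
sgb = GradientBoostingClassifier(n_estimators=100, subsample=0.5, random_state=0)
sgb.fit(X, y)
print("training accuracy:", sgb.score(X, y))
```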
Bagging and boosting are both ensemble learning methods that have become popular over the last 15 years or so, although boosting is typically the more powerful of the two. Bagging was originally designed to achieve variance reduction: each base learner is trained on a bootstrap resample of the data and the results are averaged, so you can view it as a specific application of the bootstrap. That perspective is probably closer to the sub-sampling behaviour you are asking about.
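For comparison, here is a small sketch (again on made-up data) contrasting bagging, where each base learner sees a bootstrap resample, with AdaBoost, where every learner sees the whole reweighted sample:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_informative=8, random_state=0)

# bagging: each tree is fit on a bootstrap resample, predictions are averaged
bag = BaggingClassifier(n_estimators=50, random_state=0)
# AdaBoost: each weak learner is fit on the full, reweighted training set
boost = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("bagging", bag), ("adaboost", boost)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```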