Active learning is a subfield of machine learning in which the learning algorithm is allowed to choose the data from which it learns. In some cases, active learning has been shown to yield an exponential improvement in the number of labeled samples the algorithm needs in order to reach generalization error $\leq \epsilon$. In this work we study the problem of learning halfspaces with membership queries. In the membership query setting, the learning algorithm may ask for the label of any point in the input space. We propose a new algorithm for this problem and prove that it achieves a near-optimal label complexity in some cases. We also show that the algorithm works well in practice and significantly outperforms uncertainty sampling.
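To make the membership-query model concrete, the sketch below (not the paper's algorithm, just a generic illustration) shows a learner that may query the label of any point it constructs, rather than selecting from a fixed unlabeled pool; it bisects along random directions to locate boundary points of a synthetic target halfspace. The names `oracle` and `query_halfspace` and the bisection strategy are assumptions introduced here for illustration.

```python
# Minimal sketch of the membership-query model for halfspaces (illustrative only).
# The target w* is synthetic; in the real setting it is unknown to the learner.
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)          # hidden origin-centered halfspace sign(<w*, x>)

def oracle(x):
    """Membership-query oracle: returns the label of ANY point x the learner chooses."""
    return 1 if np.dot(w_star, x) >= 0 else -1

def query_halfspace(num_directions=20, bisection_steps=30):
    """Hypothetical query strategy: bisect between a positive and a negative point
    along random directions; each bisection step costs one membership query."""
    boundary_pts = []
    for _ in range(num_directions):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        if oracle(u) == oracle(-u):       # defensive skip (essentially never triggers)
            continue
        pos, neg = (u, -u) if oracle(u) == 1 else (-u, u)
        for _ in range(bisection_steps):  # binary search for the decision boundary
            mid = (pos + neg) / 2
            if oracle(mid) == 1:
                pos = mid
            else:
                neg = mid
        boundary_pts.append((pos + neg) / 2)
    # Boundary points are (approximately) orthogonal to w*; recover its direction
    # as the smallest right singular vector of the boundary-point matrix.
    _, _, vt = np.linalg.svd(np.array(boundary_pts))
    w_hat = vt[-1]
    if oracle(w_hat) == -1:               # fix the sign with one extra query
        w_hat = -w_hat
    return w_hat

w_hat = query_halfspace()
print("angle to target (radians):",
      np.arccos(np.clip(np.dot(w_hat, w_star), -1.0, 1.0)))
```

In contrast, pool-based strategies such as uncertainty sampling may only pick query points from a given unlabeled sample; the ability to synthesize arbitrary query points is what the membership-query model adds.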