Accurate and efficient localization with a conveniently established map is a fundamental requirement for mobile robot operation in warehouse environments. An accurate AprilTag map can be conveniently established with the help of LiDAR-based SLAM. Although a LiDAR-based system is usually not commercially competitive compared with a vision-based system, for warehouse applications only a single LiDAR-based SLAM system is needed to establish an accurate AprilTag map, whereas a large number of visual localization systems can share this established map for their own operations. The cost of the LiDAR-based SLAM system is therefore shared among the many visual localization systems and turns out to be acceptable, even negligible, for practical warehouse applications. Once an accurate AprilTag map is available, visual localization is realized as recursive estimation that fuses AprilTag measurements (i.e., AprilTag detection results) with robot motion data. AprilTag measurements may be nonlinear partial measurements, which the well-known extended Kalman filter (EKF) can handle in the spirit of local linearization. However, AprilTag measurements also tend to be temporally correlated, which the EKF cannot handle in a principled way. The split covariance intersection filter (Split CIF) is therefore adopted to handle the temporal correlation among AprilTag measurements; in the spirit of local linearization, the Split CIF can handle nonlinear partial AprilTag measurements as well. The Split CIF based visual localization system incorporates a measurement adaptive mechanism to handle outliers in AprilTag measurements and adopts a dynamic initialization mechanism to address the kidnapping problem. A comparative study in real warehouse environments demonstrates the potential and advantages of the Split CIF based visual localization solution.
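For context, the following is a brief sketch of the generic split covariance intersection fusion rule that the Split CIF builds on; the notation $(x_1, P_{1i}, P_{1d})$, $(x_2, P_{2i}, P_{2d})$ and the weight $\omega$ are generic and for illustration only, and the exact measurement-update form used in this work (with local linearization of the AprilTag measurement model) may differ. Each estimate's covariance is split as $P_j = P_{jd} + P_{ji}$, where $P_{jd}$ is the part that may be correlated with the other estimate and $P_{ji}$ is the part known to be independent. The two estimates are then fused as
\[
\begin{aligned}
P_1 &= \frac{P_{1d}}{\omega} + P_{1i}, \qquad
P_2 = \frac{P_{2d}}{1-\omega} + P_{2i}, \qquad \omega \in (0,1],\\
P &= \left(P_1^{-1} + P_2^{-1}\right)^{-1}, \qquad
x = P\left(P_1^{-1}x_1 + P_2^{-1}x_2\right),\\
P_i &= P\left(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1}\right)P, \qquad
P_d = P - P_i,
\end{aligned}
\]
where $\omega$ is typically chosen to minimize the trace or determinant of the fused covariance $P$. The split $(P_d, P_i)$ of the fused covariance is carried forward so that correlated information (e.g., from temporally correlated AprilTag measurements) is never double-counted as independent.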